CRBC News

EU Steps Back from Mandatory CSAM Scanning — Leaves Enforcement to Member States

The EU Council has adopted a compromise position on draft child-protection rules that requires platforms to assess CSAM risks and take preventive measures but stops short of mandating EU-wide automated scanning or removal. Enforcement will be handled by national authorities, and companies may continue voluntary scanning after a privacy exemption expires in April. The proposal would also create an EU Centre on Child Sexual Abuse; lawmakers and governments must now negotiate the final text.

European Union member states have agreed a joint position on proposed online child-protection legislation that stops short of imposing an EU-wide duty on technology firms to actively detect and remove child sexual abuse material (CSAM).

What the Council agreed

The EU Council announced a compromise that requires online service providers to assess the risk that their platforms could be used to distribute CSAM or to solicit children, and to adopt appropriate preventive measures. However, the Council's text delegates enforcement to national authorities rather than imposing uniform, mandatory scanning or removal obligations across the EU.

"Member states will designate national authorities ... responsible for assessing these risk assessments and mitigating measures, with the possibility of obliging providers to carry out mitigating measures. In the event of non-compliance, providers could be subject to penalty payments," the Council said.

Voluntary scanning and support structures

The draft also permits companies to continue voluntarily scanning content for child sexual abuse after an existing exemption from online privacy rules expires in April next year. The package would establish an EU Centre on Child Sexual Abuse to help member states implement the rules and to provide assistance to victims.

Next steps

EU governments must now negotiate the text with the European Parliament, which in 2023 backed a stricter proposal that would have required messaging services, app stores and internet access providers to report and remove both known and newly discovered images and videos, and to act on grooming cases. The Council's softer approach is seen as a win for major U.S. tech firms and for privacy advocates, who warned that mandatory scanning would threaten user privacy.

Reactions and wider context

Denmark’s justice minister, Peter Hummelgaard, welcomed the agreement, saying: "Every year, millions of files are shared that depict the sexual abuse of children. And behind every single image and video, there is a child who has been subjected to the most horrific and terrible abuse. This is completely unacceptable."

The European Parliament separately called on the EU to set minimum ages for social media access to address rising adolescent mental health concerns linked to excessive online exposure; that call is non-binding. Internationally, Australia is preparing what would be the world's first social media ban for children under 16, and Denmark and Malaysia have also announced plans for age-related restrictions.

As negotiations continue between EU institutions and member states, the final balance — between protecting children online and safeguarding privacy and free expression — will be central to the law that ultimately emerges.
