The European parliament has blocked the extension of a law that allowed big tech companies to scan their platforms for evidence of child sexual exploitation, creating a legal gap that experts warn will lead to crimes going undetected. The law, introduced in 2021 as a temporary measure, permitted companies to use automated detection technologies to identify child sexual abuse material, grooming, and sextortion.
The law's expiration on 3 April has left tech companies in an uncertain position: under the Digital Services Act they remain obliged to remove any illegal content hosted on their platforms, but they no longer have an explicit legal basis to scan for it. Google, Meta, Snap, and Microsoft have said they will continue to voluntarily scan their platforms for child sexual abuse material despite the regulatory gap.
In a joint statement, the companies expressed disappointment at the EU's 'irresponsible failure' to maintain established efforts to protect children online. The European parliament says it has prioritized work on permanent legislation to prevent and combat child sexual abuse online, but has offered no timeline for agreement or implementation.
Concerns Over Increased Abuse
Child protection advocates have warned that the lapse in legislation will likely trigger a steep fall in reports of child sexual abuse. A similar legal gap in 2021 saw reports of such material from EU-based accounts to the National Center for Missing and Exploited Children (NCMEC) fall by 58% over 18 weeks.
'When detection tools are disrupted, we lose visibility that directly impacts our ability to find and protect child sexual abuse victims,' said John Shehan, vice-president at the NCMEC. 'When detection goes dark, the abuse doesn't stop.'
In 2025, the NCMEC received 21.3m reports of suspected child sexual abuse, containing more than 61.8m images, videos, and other files. About 90% of those reports related to countries outside the US.
Global Implications
The lapse of the EU's scanning permission will have ripple effects in other regions, child safety experts say. Many internet crimes are cross-border, with perpetrators sending illegal images to recipients, or targeting children, in other countries. 'Sextortionists' may also capitalize on the law change, Shehan said.
'The offender can be anywhere in the world, but they could have unfettered access to minors in Europe now that there's legal uncertainty around those safeguards and protections to identify when a child is being groomed,' Shehan said.
Privacy Concerns
Privacy advocates argue that big tech scanning messages for child abuse threatens fundamental privacy rights and data security for EU citizens. Child safety experts counter that the technology works by matching content against digital fingerprints, or hashes, of known abuse images and videos, with machine learning used to flag previously unseen material, and does not store the messages it scans.
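For illustration, the sketch below shows the basic shape of hash-based matching against known material. It is a simplification, not any company's actual system: production tools such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, and the hash list and function names here are placeholders invented for the example.

```python
# Illustrative sketch of hash-based detection of known abuse imagery.
# Real deployments use perceptual hashing and hash lists supplied by
# clearing houses such as NCMEC; a cryptographic hash and a hard-coded
# placeholder set are used here only to keep the example self-contained.

import hashlib

# Placeholder list standing in for a database of known-material hashes.
KNOWN_ABUSE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears on the known-material list.

    Only the fixed-length digest is compared; the image itself is not
    stored or inspected beyond computing the hash.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_ABUSE_HASHES

if __name__ == "__main__":
    sample = b"example upload"
    print(matches_known_material(sample))  # False: not on the placeholder list
```

Because only a fixed-length digest is compared against the list, the platform learns nothing about content that does not match, which is the basis of the argument that such scanning is narrower than general surveillance of messages.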
'Blocking CSAM [child sexual abuse material] is not an invasion of privacy. Free speech does not include sexual abuse of children,' said Hannah Swirsky, head of policy and public affairs at the Internet Watch Foundation.
Under separate legislation adopted in 2021, the EU continues to allow tech companies to voluntarily scan messages to detect terrorist content. 'The EU is effectively risking open doors for predators,' Swirsky said. 'If the EU is serious about protecting children online, then it needs to agree on a permanent legislative framework for safeguarding children and for enabling detection.'