Europe’s child safety laws require collecting the data its privacy laws forbid

Summary: Europe's efforts to protect children online have collided with its privacy architecture. Key developments include the expiration of an ePrivacy derogation, a hacked age verification app, and the stalled CSA Regulation. These issues highlight the tension between child safety and data privacy regulations in the EU.

Europe’s child safety laws require collecting data that its privacy laws forbid. Three developments brought the conflict to a head:

  • The ePrivacy derogation allowing voluntary CSAM (Child Sexual Abuse Material) scanning expired on April 3 after a Parliament vote to reject its extension.
  • The new EU age verification app, announced April 15, was hacked in under two minutes.
  • The CSA Regulation, intended to mandate platform detection of CSAM in private messages, remains stuck in trilogue negotiations with a July deadline.

The Scanning Gap:

The ePrivacy derogation, introduced as a temporary measure, allowed platforms such as Meta, Google, and Microsoft to voluntarily scan private messages for CSAM without violating EU privacy law. Its expiration leaves a legal gap: voluntary scanning no longer has an explicit basis in EU law. Meanwhile, researchers' demonstration that the new age verification app could be broken in under two minutes of testing has cast doubt on the tools meant to fill that gap.

Trilogue Negotiations:

The CSA Regulation, formally the Child Sexual Abuse Regulation (CSA Reg) and nicknamed "Chat Control", aims to replace the voluntary framework with a mandatory one. Trilogue negotiations between the Parliament, the Council, and the Commission have been ongoing since 2022, with challenges arising from:

  • Privacy concerns: the Parliament opposes provisions it considers incompatible with the right to privacy of communications.
  • Child safety advocates' views: organizations supporting child safety argue that ending voluntary scanning will leave platforms blind to abuse material on their own systems.

The proposed CSA Regulation would empower a new EU Centre to issue detection orders requiring platforms to scan messages for:

  • Known CSAM (using hash-matching technology)
  • New CSAM
  • Grooming behavior
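The hash-matching approach in the first bullet is the most mature of the three detection targets. A minimal sketch of the idea, using Python's standard `hashlib` and a hypothetical `KNOWN_HASHES` list (real deployments rely on databases of perceptual hashes, such as PhotoDNA, that survive resizing and re-encoding; a cryptographic hash as shown here only matches byte-identical files):

```python
import hashlib

# Hypothetical hash list for illustration only. Real systems use large,
# curated databases of perceptual hashes rather than SHA-256 digests.
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"test", standing in for a known file
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_material(data: bytes) -> bool:
    """Return True if these exact bytes appear in the known-hash list."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_HASHES
```

The limitation this sketch exposes is why "known CSAM" detection is considered more privacy-preserving than the other two targets: it can only confirm matches against an existing list, whereas detecting new material or grooming behavior requires analysing the content of messages themselves.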

Key modifications made by the Parliament include:

  • Rejecting scanning of end-to-end encrypted messages
  • Limiting detection to known material
  • Excluding real-time communications

The Council, whose rotating six-month presidency currently steers the file, continues to push for stronger law enforcement access provisions.