6 truths about the chat control debate
The reality of child safety and encryption
The infamous chat control proposal is back on the EU agenda. I have written previously about this legislation, which would require certain online platforms to detect and prevent child sexual abuse (CSA) taking place on their services.
This proposal poses clear dangers to end-to-end encryption and data protection. But to navigate the debates it raises, certain truths need to be acknowledged. I have identified six pertinent ones.
Truth #1: Messaging apps are used for offences against children online
Research from the UK's Centre of Expertise on Child Sexual Abuse demonstrates how perpetrators tend to move their interactions with children to private messaging platforms.1 Platforms that protect messages with end-to-end encryption (E2EE) make it much more difficult to detect grooming or the generation and sharing of child sexual abuse material (CSAM). The use of E2EE messaging platforms for criminal activity is well known to law enforcement and is certainly one part of the threat vector for online CSA.
Truth #2: Messaging apps are one part of the threat vector
The research on CSA offences shows that such activity comprises several stages, and encrypted messaging apps play a role in only some of them. Perpetrators may start in public spaces before moving their interactions with children to private ones.2
Truth #3: Messaging apps are not the only intervention point
If messaging apps are only one part of the threat vector, then they are not the only point at which CSA offences can be detected and prevented. Other intervention points, such as the more publicly visible parts of social media platforms, may be far more accessible.3
Truth #4: It is hard to know the extent of the role messaging apps play in the threat vector
While encrypted messaging apps form part of this lifecycle, it is difficult to know exactly how involved they are. Because message content is accessible to no one but the sender and recipient, visibility into these apps is limited, and the prevalence of CSA on them is hard to measure. This is a crucial missing piece of information.
Truth #5: Targeted surveillance is not possible with E2EE
E2EE is a setup in which the service provider does not hold the cryptographic keys needed to read the messages exchanged on its service; only the sender and recipient do. As human rights case law has acknowledged, in the context of encrypted messaging apps, removing encryption for one user effectively removes it for all users: giving the provider the ability to generate its own copies of the decryption keys strips protection from every user, including those who pose no threat.4 It is essentially encryption for all or encryption for none.
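To make that key model concrete, here is a minimal sketch using the PyNaCl library. It is an illustrative two-party example, not the protocol of any particular messaging app; the names and message are invented. The point to notice is that the private keys exist only on the users' devices, so the relaying server never holds anything capable of decrypting the message.

```python
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; the private
# half never leaves that device and is never sent to the provider.
alice_secret = PrivateKey.generate()
bob_secret = PrivateKey.generate()

# Alice encrypts to Bob using her private key and Bob's public key.
ciphertext = Box(alice_secret, bob_secret.public_key).encrypt(b"hello Bob")

# The provider relays only this ciphertext. It holds no key that can
# open it, so it cannot "unlock" one suspect's messages on demand.
plaintext = Box(bob_secret, alice_secret.public_key).decrypt(ciphertext)
assert plaintext == b"hello Bob"
```

Any scheme that lets the provider decrypt on demand has to insert a provider-held key into this picture, and that key necessarily reaches every conversation, not just a suspect's.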
Truth #6: The outcomes of this debate will have stark impacts on other areas
Replace CSA with "terrorist content" or "misinformation" or any other online harm, and the truths and issues above remain. If the law requires service providers to implement mechanisms to detect and prevent one kind of harm, why stop at one? Once the mechanism is there, the risk of function creep increases substantially.
1. Centre of Expertise on Child Sexual Abuse, "A new typology of child sexual abuse offending" (March 2020), pp. 15-16.
2. Rachel O'Connell, "A typology of child cybersexploitation and online grooming practices", p. 8.
3. See the June 2023 report from the Stanford Internet Observatory.
4. Podchasov v Russia, App no. 33696/19 (ECHR, 13 February 2024), para. 77.