Chat control and E2EE
The EU's proposed online CSA regulation leaves the future of E2EE uncertain
TL;DR
This newsletter is about the EU's proposed Regulation laying down rules to prevent and combat online child sexual abuse, colloquially known as 'chat control' (May 2022 version). It looks at the scope of the legislation, the detection obligations it imposes on service providers and its potential incompatibility with end-to-end encryption.
Here are the key takeaways:
The EU's proposed Regulation applies to cloud storage providers, messaging platforms and internet access providers. The Regulation imposes obligations on these service providers to reduce the risk of child sexual abuse taking place, as well as to detect, report and remove or disable such activity.
Detection obligations are imposed on these service providers via detection orders issued through a detailed process set out in the legislation. This process involves the completion of risk assessments, data protection impact assessments and court proceedings.
However, the Regulation itself does not specify the tools that should be used to comply with detection orders. Rather, the legislation sets out general requirements that technologies used to comply with detection obligations should adhere to.
Similarly, the operative provisions of the legislation do not directly address how service providers should comply with detection orders when using end-to-end encryption. Accordingly, the proposed Regulation itself does not, on its face, ban the use of end-to-end encryption.
However, end-to-end encryption may significantly hinder a service provider's ability to comply with a detection order. Given this reality, the proposed legislation may nudge service providers towards certain decisions, such as breaking end-to-end encryption in order to comply with a detection order.
The EU Council was set to vote on the Regulation in June 2024 but scrapped the vote due to a lack of sufficient support. The EU plans to continue its work on the legislation later in 2024.
What is the EU's proposed online CSA regulation?
In 2022, the European Commission published its proposal for a new law on the detection and prevention of online child sexual abuse (CSA). The Regulation sets out rules imposed on online service providers to address CSA taking place on their platforms.
CSA here means both child sexual abuse material (CSAM) and the solicitation of children for sexual purposes, also known as grooming.1 The proposed Regulation imposes its obligations on "relevant information society services", which encompasses providers of the following services:2
Hosting services. This means services providing the storage of information, such as cloud storage providers.3
Interpersonal communications services. This means services enabling direct interpersonal and interactive exchange of information via electronic communications networks, such as online messaging platforms.4
Software application stores. This means intermediation services providing software applications.5
Internet access services. This means publicly available electronic communications services that provide access to the internet.6
The Regulation imposes obligations on these service providers to reduce the risk of CSA taking place, as well as to detect, report and remove or disable such activity. These obligations apply even to those providing their services with end-to-end encryption (E2EE).
Detection obligations under the online CSA regulation
The Regulation proposed by the European Commission does not itself set out specific detection obligations on service providers. Rather, the proposal sets out a framework whereby detection obligations are imposed on service providers in the form of detection orders, subject to certain conditions.
This framework for detection obligations involves two authorities that are established by the Regulation:
Coordinating Authorities. Member States are required to designate at least one authority as "responsible for the application and enforcement of [the] Regulation."7 This includes the designation of a 'Coordinating Authority' that must contribute "to the effective, efficient and consistent application and enforcement of [the] Regulation throughout the Union."8
EU Centre on Child Sexual Abuse. This is a new EU Agency "supporting and facilitating the implementation of [the] provisions concerning the detection, reporting, removal or disabling of access to, and blocking of online [CSA]." It must also "gather and share information and expertise and facilitate cooperation between relevant public and private parties in connection to the prevention and combatting of [CSA], in particular online."9
With the involvement of these authorities, the framework for issuing detection orders consists of the following steps:
Service providers are first required to complete risk assessments that document the risk of their platforms being used for CSA purposes as well as the appropriate mitigation measures for addressing the identified risk.10 Some mitigation measures are specified in the Regulation, such as the adaptation of content moderation systems, but service providers may enjoy some flexibility in deciding what measures are most appropriate for the risk identified in their assessments.11
These risk assessments must then be submitted to the relevant Coordinating Authority, which must consider the assessment and other factors in deciding whether to apply to a court (or another independent administrative authority) to impose a detection order on the service provider.12 In particular, the Coordinating Authority must consider that (i) there is evidence of a 'significant risk' of online CSA13 and (ii) the reasons for issuing the detection order outweigh any negative consequences for the rights and legitimate interests of all parties affected, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties.14
The rules around what should be included in a detection order are descriptive in nature; rather than outlining the specific measures or tools that should be used by a service provider, the Regulation merely specifies the objective that should be achieved, which is the installation and operation of technologies to detect CSA.15 To that effect, the Regulation also states that the order must include, among other things, the specific service to which the order applies (and even the part or component of service if this granularity is required) as well as whether the order concerns known or new CSAM or grooming.16
A draft of the request for a detection order must be shared with the EU Centre and the service provider for comment, and the service provider is offered the opportunity to develop an implementation plan detailing the measures to be taken to comply with the order as well as complete a data protection impact assessment (DPIA), consulting with the relevant data protection authority where the conditions under Articles 35 and 36 of the GDPR are met.17 The service provider also has the right to challenge the detection order before the court or independent administrative authority issuing the order.18
The court or independent administrative authority considering the application for the detection order from the Coordinating Authority must ensure that a balancing test has been carried out by the Coordinating Authority to ensure that the obligations proposed in the order are necessary and proportionate. The Regulation specifies what should be taken into account for this balancing test.19
The diagram below sets out the full process for the issuing of detection orders under the Regulation:
Detection orders and E2EE
In the proposal, the European Commission claims to have taken a technology-neutral approach due to “the fast-changing nature of the services concerned and the technologies used to provide them.”20 Accordingly, the operative provisions of the proposed Regulation do not:
Identify the specific detection tools that should be used by service providers to comply with detection orders.
Directly address the use of E2EE in complying with detection orders.
In fact, on E2EE, Recital (26) of the proposal states the following:
...this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. (Emphasis added)
So how service providers are to comply with detection orders, including in E2EE environments, is left, ultimately, to the providers themselves.
As mentioned above, the Regulation does not require detection orders to specify the technologies to be used for compliance, for example hash-based detection tools.
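To illustrate what a hash-based detection tool does in principle, here is a minimal sketch in Python. It is illustrative only: the hash database and file contents are invented, and real deployments (such as Microsoft's PhotoDNA) use perceptual hashes that tolerate resizing and re-encoding, whereas a plain cryptographic hash like SHA-256 only catches byte-identical copies of known material.

```python
import hashlib

# Hypothetical database of hashes of known material (illustrative values).
# The entry below is simply the SHA-256 digest of the bytes b"test".
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_hex(data: bytes) -> str:
    """Compute the SHA-256 digest of the data as a hex string."""
    return hashlib.sha256(data).hexdigest()

def matches_known_material(file_bytes: bytes) -> bool:
    """Return True if the file's hash appears in the known-material database."""
    return sha256_hex(file_bytes) in KNOWN_HASHES
```

The key point for the E2EE debate is that this kind of matching requires access to the unencrypted content: the provider cannot compute a hash over material it cannot read.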
Furthermore, the provisions on detection technologies contained in the proposed Regulation are somewhat vague, and only outline the general requirements that the technologies should adhere to:21
Be effective for the purpose
Not be able to extract information not strictly necessary for the detection of CSAM or grooming
Be in accordance with the state of the art in the industry and the least intrusive in terms of the impact on the users’ right to privacy, including the confidentiality of communication, and to protection of personal data
Limit the rate of errors to the maximum extent possible
Accordingly, the proposed Regulation itself does not, on its face, ban the use of E2EE. Rather, the legislation sets out general requirements that technologies used to comply with detection obligations should adhere to.
However, the specific requirements of detection obligations would be contained in detection orders that are issued by a court or another independent administrative authority in a Member State. Therefore, if that order obligates an interpersonal communication service provider to detect CSAM or grooming on its platform, then the use of E2EE may prevent such an obligation from being properly fulfilled.
Is the proposal ultimately incompatible with E2EE?
As already set out, the technology-neutral approach of the European Commission means that the proposed Regulation does not itself indicate (i) the technological specificities of how to carry out CSA detection in general, nor (ii) the technologies that should, or should not, be used for detection operations in the context of E2EE environments.
Rather, as evidenced by the process for completing risk assessments and the issuance of detection orders, these decisions are left to the service provider. It is therefore the service provider that is given the space to determine how to best comply with a detection order.
However, while the Regulation does provide this apparent flexibility, the technological reality for those service providers using E2EE is that the proposed legislation may nudge them towards certain decisions.
As presented above, providers using E2EE will have limited options to effectively deploy CSA detection tools. This is because E2EE prevents access to the content of communications, and in some instances even certain metadata is not accessible to service providers for CSA detection purposes, which I have written about previously.
So from a technical perspective, providers may have little choice but to explore potential client-side scanning solutions that attempt to scan communications before they are encrypted and transmitted. Yet such a choice could be criticised as 'breaking' E2EE, since it undermines the essential goal of keeping communications secure between users.
Therefore, looking at the provisions of the Regulation and the nature of the systems operated by certain service providers, there is the potential for the legislation to be used to issue detection orders whose compliance requirements may be incompatible with E2EE. It should be noted, however, that the premise of this argument is based on the possibility of an incompatible detection order being issued rather than the Regulation itself explicitly providing for an incompatible detection order.
Indeed, in issuing such an order, certain factors must be considered, including a balancing test to ensure its necessity and proportionality. Additionally, service providers must only use detection technologies that, among other things, are in accordance with the state of the art in the industry and the least intrusive in terms of the impact on the users’ right to privacy, including the confidentiality of communication, and to protection of personal data.
So while the European Commission has not directly addressed the issue of E2EE in the operative provisions of the proposal, there are nevertheless provisions in the proposal that do pertain to such a technological measure. It is therefore reasonable to conclude from the text of the Commission’s proposed legislation that it acknowledges the potential for detection orders to have a significant (negative) impact on the use of E2EE, hence the safeguards that are included in the legislation to at least mitigate any negative impact.
What is the current status of the proposed legislation?
The proposed Regulation was scheduled to be voted on by the EU Council in June 2024. However, this vote was cancelled, with one diplomat explaining that "the required qualified majority would just not be met."
According to a draft agenda for EU Council meetings, the vote seems to have been delayed to the second half of 2024. Patrick Breyer, who has advocated against the legislation, has been following it closely on his website.
In the meantime, many service providers will likely continue to argue against the proposed legislation. For example, Meredith Whittaker, President of Signal, has stated the following on the proposal:
There is no way to implement such proposals in the context of end-to-end encrypted communications without fundamentally undermining encryption and creating a dangerous vulnerability in core infrastructure that would have global implications well beyond Europe.
Let's see where this legislation ends up.
Digital Services Act, Article 3(g)(iii).
European Electronic Communications Code, Article 2(5). The proposed Regulation even includes within its scope services that provide interpersonal communication services merely as an ancillary feature intrinsically linked to another service (see Article 2(b) of the proposal).
Digital Markets Act, Article 2(14).
Regulation (EU) 2015/2120, Article 2(2).
Proposal for a Regulation laying down rules to prevent and combat child sexual abuse (May 2022), Articles 5(1) and 7(4).
The Regulation sets out what constitutes a significant risk with respect to both known CSAM and unknown CSAM and also grooming. See Article 7(5), 7(6) and 7(7) of the proposal.
Proposal for a Regulation laying down rules to prevent and combat child sexual abuse (May 2022), Article 8(1)(d) and (e).