AI has to be explainable
A look at a recent CJEU case on automated decision-making and subject access requests
TL;DR
This newsletter is about a CJEU judgment on automated decision-making. It looks at the court's stipulations on the right of data subjects to an explanation of automated decisions concerning them, the exceptions to this right, and the implications of the judgment for the development and deployment of AI systems.
Here are the key takeaways:
The court case concerns a mobile telephone operator which refused to give a data subject a phone contract on the basis of an automated credit assessment. The data subject challenged the decision, and the ensuing proceedings ultimately required a ruling from the Court of Justice of the European Union (CJEU).
Under the GDPR, when the personal data of a data subject is used for automated decision-making concerning them, that data subject has the right to receive from the data controller information about that processing. In particular, the data subject is entitled to receive "meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject."
In the case before the CJEU, two key questions were addressed:
Exactly what kind of information is a data subject entitled to under the GDPR regarding automated decision-making using their personal data?
How is the data subject's right affected if the information includes trade secrets or the personal data of a third party?
On the first question, the CJEU held that data subjects are entitled to an explanation of "the procedure and principles actually applied in order to use, by automated means, the personal data" concerning them with a view to obtaining a specific result, such as a credit profile. This information must be provided "in a concise, transparent, intelligible and easily accessible form."
On the second question, the CJEU held that the rights of the data subject and the rights of third parties, including trade secrets, need to undergo a balancing test. The respective rights and interests at issue must be balanced "with a view to determining the extent of the data subject's right of access" under the GDPR.
Facts of the Case
In Austria, a mobile telephone operator refused CK a phone contract on the basis of an automated credit assessment. That assessment, which was carried out by Dun & Bradstreet (D&B), found that CK did not have sufficient financial creditworthiness.
CK brought the matter to the Austrian data protection authority, which ordered D&B to disclose to CK meaningful information about the logic involved in the automated decision-making based on her personal data.
D&B brought an action against this decision before the Federal Administrative Court in Austria. It claimed that it was not required to disclose the information to CK because the information was a protected trade secret.
The Federal Administrative Court held that D&B had infringed the GDPR by not providing CK with the meaningful information she requested, or by failing to give a sufficient statement of reasons as to why it was unable to provide that information. In particular, the Court held that D&B had not provided CK with sufficient explanations to understand how the probability of her future behaviour ('score') was calculated using her socio-demographic data.
CK then went to the City Council of Vienna to have the Court's decision enforced. But the Council rejected the enforcement application on the ground that D&B had complied with its obligation to provide information, even though the company had not provided any additional information after the decision was adopted.
The matter came before the CJEU which had to address two questions:
Under the GDPR, do data subjects have the right to require data controllers to provide an exhaustive explanation of the procedure and principles actually applied in order to use, by automated means, the personal data concerning that person with a view to obtaining a specific result, such as a credit profile?
If the controller believes that the information to be provided to the data subject contains data of third parties protected by the GDPR or trade secrets, is that controller required to provide the information to a competent authority or court, so that it can balance the rights and interests at issue and determine the extent of the data subject's rights under the GDPR?
The right to an explanation regarding automated decision-making
The GDPR defines automated decision-making as a decision based solely on automated processing which produces legal effects concerning the data subject or similarly significantly affects them.1 But the wording of the provision actually precludes such processing by default:
The data subject shall have the right not to be subject to a decision based solely on automated processing... (Emphasis added)
Automated decision-making is only permitted under the GDPR if:
It is necessary for the performance of a contract
It is authorised by law
The data subject has given explicit consent to the processing
For processing to constitute 'automated decision-making' under the GDPR, all three of the following conditions need to be met:
A decision needs to have been made
That decision has to have been based solely on automated processing or profiling
The automated decision needs to have produced either a legal effect or a similarly significant effect on the data subject
A decision here can be an action or stance taken regarding a data subject that has a binding effect on them.2 Refusing to give a person a mobile telephone contract, as was the case with CK, certainly falls within this definition.
For a decision to be based solely on automated processing, it must be made without any meaningful human involvement. In CK's case, the decision to refuse her a phone contract was based only on her automated credit assessment, without any human intervention.
A decision that produces 'legal effects' is one that affects a person's legal rights or status. This could be the cancellation of a contract,3 and also, as with CK, the denial of a contract.
Article 22.1 also expressly includes 'profiling' in its definition of automated decision-making, a concept which itself has a specific definition under the GDPR:
...any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.4
So, to apply all this to CK's case (a toy encoding of the test follows this list):
The controller, a mobile telephone operator, made a decision to deny CK a telephone contract
That decision was made on the basis of an assessment of CK's creditworthiness, which involved profiling CK based on her personal data
The decision resulted in CK being denied a telephone contract with the company
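For those who like to see the structure explicitly, the cumulative test can be expressed as a simple conjunction. This is my own toy encoding, not anything from the judgment or the text of the GDPR; the field names are invented for illustration.

```python
# A toy encoding of the cumulative Article 22(1) test, applied to facts like
# CK's. My own illustration, not from the judgment; field names are invented.
from dataclasses import dataclass

@dataclass
class Processing:
    decision_made: bool                 # an action or stance with binding effect
    solely_automated: bool              # no meaningful human involvement
    legal_or_significant_effect: bool   # e.g. refusal of a contract

def is_automated_decision_making(p: Processing) -> bool:
    """All three conditions must hold for Article 22(1) to apply."""
    return p.decision_made and p.solely_automated and p.legal_or_significant_effect

ck_case = Processing(
    decision_made=True,                # the operator refused CK a contract
    solely_automated=True,             # based only on the automated credit assessment
    legal_or_significant_effect=True,  # denial of a contract
)
print(is_automated_decision_making(ck_case))  # True -> Article 22 rights engaged
```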
Under Article 15.1(h) GDPR, if a data subject's personal data is subject to automated decision-making, they have a right to obtain from the data controller, as part of a subject access request (SAR), "meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject."
So one of the key questions for the CJEU in this case was exactly what kind of information a data subject is entitled to under this provision. What explanation does a data subject have a right to under the GDPR when it comes to automated decision-making?
The CJEU answered this question by looking at the context and objectives of the rules under the GDPR, and how both need to be accounted for in the interpretation of those very rules.5
Context of the rules under the GDPR
The CJEU stated that, looking at the different language versions of the provision, "the wording of Article 15(1)(h) of the GDPR covers all relevant information concerning the procedure and principles relating to the use, by automated means, of personal data with a view to obtaining a specific result."6
But regard must also be had to the whole provision of Article 15(1)(h), which states that meaningful information about the logic involved must be provided as well as "the significance and the envisaged consequences of such processing for the data subject."7
Other provisions must be taken into account too, including Article 12.1, which requires information to be provided in a concise, transparent, intelligible and easily accessible form; this requirement also applies to information provided under Article 15.8
So overall:
The examination of the context of which Article 15(1)(h) of the GDPR forms part thus supports the interpretation that emerges from the analysis of the wording of that provision, according to which ‘meaningful information about the logic involved’ in automated decision-making, within the meaning of that provision, covers all relevant information concerning the procedure and principles relating to the use of personal data with a view to obtaining, by automated means, a specific result, the obligation of transparency also requiring that that information be provided in a concise, transparent, intelligible and easily accessible form.9
Objectives of the rules under the GDPR
As the CJEU has held before, the objective of the GDPR is to ensure a high level of protection of fundamental rights and freedoms, in particular the right to data protection under Article 16 of the Treaty on the Functioning of the European Union and Article 8 of the EU Charter.10 Furthermore, Recital (11) GDPR states that the purpose of the GDPR is to strengthen data subject rights.
CJEU case law has shown that subject access requests (SARs) can enable the data subject to ensure that their personal data are correct and processed lawfully.11 SARs are therefore necessary to exercise other rights.12
Accordingly, Article 15.1(h) is about enabling data subjects to exercise their rights under Article 22.3. Those rights include the rights to obtain human intervention on the part of the controller, to express their point of view, and to contest decisions based on automated processing.13
But in order for data subjects to be able to exercise these rights, data subjects need to know the reasons why a decision about them was made and to have this reasoning clearly explained to them. This is consistent with Recital (71) which indicates that data subjects have a right to such an explanation.14
Accordingly:
It is apparent from the examination of the purposes of the GDPR and, in particular, those of Article 15(1)(h) thereof that the right to obtain ‘meaningful information about the logic involved’ in automated decision-making, within the meaning of that provision, must be understood as a right to an explanation of the procedure and principles actually applied in order to use, by automated means, the personal data of the data subject with a view to obtaining a specific result, such as a credit profile. In order to enable the data subject effectively to exercise the rights conferred on him or her by the GDPR and, in particular, Article 22(3) thereof, that explanation must be provided by means of relevant information and in a concise, transparent, intelligible and easily accessible form.15
Importantly, the CJEU clarified that providing this meaningful information does not mean merely communicating "a complex mathematical formula, such as an algorithm, or...a detailed description of all the steps in automated decision-making."16 This is because such pieces of information are neither concise nor intelligible for the data subject.
Instead, to satisfy the requirement to provide meaningful information, the controller must "describe the procedure and principles actually applied in such a way that the data subject can understand which of his or her personal data have been used in the automated decision-making at issue."17 The complexity of the automated processing carried out by the controller cannot excuse it from providing an explanation of its decisions regarding the data subject.
The CJEU also made two further important points:
It is sufficient for the controller to inform the data subject of "the extent to which a variation in the personal data taken into account would have led to a different result."18 (A sketch of this counterfactual idea follows this list.)
Data subjects are also entitled to receive any profiles generated as part of the automated processing, since personal data generated by the controller itself is within the scope of the SAR.19
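To make that counterfactual idea concrete, here is a minimal sketch of how a controller might probe a scoring model to find the variation that would have flipped a decision. Everything here is hypothetical: the scoring function, threshold and figures are invented for illustration and are not drawn from the judgment or from any real credit model.

```python
# Hypothetical illustration: how a controller might demonstrate the extent to
# which varying one input would have changed an automated credit decision.
# The scoring rule and threshold below are invented for this sketch.

def credit_score(monthly_income: float, open_defaults: int, age: int) -> float:
    """A toy linear scoring rule (an assumption, not a real model)."""
    return 0.4 * (monthly_income / 1000) - 1.5 * open_defaults + 0.02 * age

APPROVAL_THRESHOLD = 1.0

def decision(monthly_income: float, open_defaults: int, age: int) -> str:
    score = credit_score(monthly_income, open_defaults, age)
    return "approve" if score >= APPROVAL_THRESHOLD else "refuse"

# A CK-style applicant (hypothetical figures)
applicant = {"monthly_income": 1800.0, "open_defaults": 1, "age": 35}
print(decision(**applicant))  # refuse

# Counterfactual probe: at what income would the decision have flipped?
for income in range(1800, 5001, 100):
    if decision(income, applicant["open_defaults"], applicant["age"]) == "approve":
        print(f"An income of roughly {income} would have led to approval.")
        break
```

The point is that the output of such a probe ("an income of roughly X would have led to approval") is the kind of concise, intelligible statement the Court has in mind, whereas the scoring function itself is not.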
The CJEU's answer to the first key legal question
...Article 15(1)(h) of the GDPR must be interpreted as meaning that, in the case of automated decision-making, including profiling, within the meaning of Article 22(1) of that regulation, the data subject may require the controller, as ‘meaningful information about the logic involved’, to explain, by means of relevant information and in a concise, transparent, intelligible and easily accessible form, the procedure and principles actually applied in order to use, by automated means, the personal data concerning that person with a view to obtaining a specific result, such as a credit profile.20
Exceptions to the right to information
The right to the protection of personal data is not an absolute right. Recital (4) GDPR states that data protection must be balanced against other fundamental rights, including the rights and freedoms enshrined in the EU Charter. Furthermore, Recital (63) states that SARs should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property.
But restrictions imposed on data protection when balancing it against other rights and freedoms must comply with the principle of proportionality.21 This means that a data controller cannot simply refuse to provide any of the information requested under a SAR. Rather, "means of communicating personal data that do not infringe the rights or freedoms of others should be chosen."22 This might mean a partial disclosure of the information requested.23
But to carry out this balancing test and determine how the right of access under Article 15 GDPR should be implemented, courts or competent national authorities may need to access the information concerned. This includes the personal data of third parties or trade secrets.24
Additionally, the CJEU held that the GDPR precludes Member State rules that exclude "the data subject’s right of access, provided for in Article 15 of the GDPR, where such access would compromise a business or trade secret of the controller or of a third party." Such a rule contravenes the requirement that the balancing of SARs against trade secrets be carried out on a case-by-case basis; it predetermines the result of the balancing test in every case, i.e., it presumes that if a SAR concerns a trade secret, then the SAR cannot be honoured.25
Accordingly, in answering the second question, the Court held:
...Article 15(1)(h) of the GDPR must be interpreted as meaning that, where the controller takes the view that the information to be provided to the data subject in accordance with that provision contains data of third parties protected by that regulation or trade secrets, within the meaning of point 1 of Article 2 of Directive 2016/943, that controller is required to provide the allegedly protected information to the competent supervisory authority or court, which must balance the rights and interests at issue with a view to determining the extent of the data subject’s right of access provided for in Article 15 of the GDPR.26
Thoughts on the case
If a decision based on the output of an automated system cannot be meaningfully explained, then that system may not be compliant with the GDPR. This includes AI systems used for automated decision-making; the deployment of such systems ought to involve effective human oversight.
In fact, this is something explicitly required under the EU AI Act regarding high-risk AI systems. Under Article 14 of the Act, some of the relevant provisions here include the following:
High-risk AI systems must be "designed and developed in such a way, including with appropriate human-machine interface tools, that they can be effectively overseen by natural persons during the period in which they are in use."27
These oversight measures should be appropriate to the risks, level of autonomy and context of use of the system, and either be built into the system by the provider or implemented by the deployer.28
Providers need to ensure that deployers can properly understand the system's limitations whilst being able to detect and address anomalies, dysfunctions and unexpected performance.29
Providers also need to enable deployers to correctly interpret outputs taking into account the interpretation tools and methods available.30
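To picture how these obligations might translate into practice, here is a hypothetical sketch of a workflow in which adverse automated outcomes are never final until a natural person has reviewed them. It illustrates the human-in-the-loop pattern, not an implementation of Article 14 itself; the names and figures are invented.

```python
# A hypothetical sketch of human oversight in an automated credit workflow:
# adverse automated outcomes are parked for human review rather than acted on.
from dataclasses import dataclass

@dataclass
class Assessment:
    applicant_id: str
    score: float       # output of the automated model (hypothetical)
    auto_outcome: str  # "approve" or "refuse"

def route(assessment: Assessment, review_queue: list) -> str:
    """Return a provisional outcome, escalating adverse results to a human."""
    if assessment.auto_outcome == "approve":
        return "approved (automated)"
    # Adverse outcome: queue it for human review instead of finalising it.
    review_queue.append(assessment)
    return "pending human review"

queue: list = []
print(route(Assessment("CK-001", score=0.42, auto_outcome="refuse"), queue))
# -> "pending human review"; a reviewer can then confirm, override, or request
#    more information, and the reasons later given to the applicant come from
#    that review rather than from the raw model output.
```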
But even if an AI system is not high-risk, there are the transparency obligations under Article 50:31
People need to know that they are interacting with an AI system32
That information needs to be provided in a clear and distinguishable manner33
This is without prejudice to other transparency obligations laid down in Union law, such as the GDPR34
The decision from the CJEU emphasises the importance of explainable AI systems. And this reflects an important aspect of the obligations imposed on providers and deployers as per the AI Act. Sharing complex mathematical formulas or algorithms will not suffice. Providers of AI systems need to ensure that humans can actually interpret and explain the outputs.
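As a rough illustration of that difference, consider a deliberately simple model whose weighted inputs can be translated into plain-language reasons. The weights and feature names below are invented for this sketch; real credit models are far more complex, which is exactly where the compliance difficulty arises.

```python
# A deliberately simple illustration of the gap between disclosing an
# algorithm and giving an intelligible explanation. The model, weights and
# feature names are all invented for this sketch.

WEIGHTS = {"payment_history": 0.5, "income_stability": 0.3, "existing_debt": -0.6}

def explain(features: dict) -> list:
    """Rank each input's contribution and phrase it for the data subject."""
    contributions = {name: WEIGHTS[name] * value for name, value in features.items()}
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    return [
        f"Your {name.replace('_', ' ')} "
        + ("counted in your favour." if contrib > 0 else "counted against you.")
        for name, contrib in ranked
    ]

for line in explain({"payment_history": 0.2, "income_stability": 0.9, "existing_debt": 0.8}):
    print(line)
# Your existing debt counted against you.
# Your income stability counted in your favour.
# Your payment history counted in your favour.
```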
But the more complex the system that is being used for automated decision-making, the more difficult it may be to comply with this explainability obligation. I have previously written about the troubles with modern AI models and being able to control how they behave. If developers cannot understand, predict or control these massive, complex models, then meaningfully explaining their outputs to data subjects seems a tall order. This might dissuade companies from relying wholesale on AI systems for certain tasks or processes.
Furthermore, the stipulations from the CJEU could be highly relevant to AI agents. If an agent makes decisions that produce legal effects or similarly significant effects on a person, then the rights of the data subject, including the right to an explanation and to contest the decision, come into play. Though this might depend on the level of autonomy afforded to the agent and the given context.
Lastly, the stipulations on trade secrets are interesting. The CJEU did not attempt to carry out the balancing test itself, leaving this to Member State courts and competent authorities (such as data protection authorities). But what the CJEU did confirm is that trade secrets do not, by default, trump the rights of data subjects. There needs to be a balancing of the two. So trade secrets cannot be automatically invoked at the outset to escape the obligations owed to data subjects.
GDPR, Article 22.1.
Kuner et al (eds), The EU General Data Protection Regulation (GDPR): A Commentary (OUP 2020), p.532.
Article 29 Working Party, 'Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679' (2018), p.21.
GDPR, Article 4.4.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 39.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), paras. 40-43.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), paras. 44-45.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 48.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 50.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 51. See also Case C-446/21, Schrems (Disclosure of data to the general public) (4 October 2024), para. 45.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 55.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 54.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 55.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), paras. 56-57.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 58.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 59.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 61.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 62.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 64.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 66.
GDPR, Article 23.1.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 72.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 73.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 74.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 75.
Case C-203/22, CK v Magistrat der Stadt Wien (27 February 2025), para. 76.
EU AI Act, Article 14.1.
EU AI Act, Article 14.3.
EU AI Act, Article 14.4(a).
EU AI Act, Article 14.4(c).
However, it should be noted that the obligations under Article 50 only apply to AI systems intended to interact directly with natural persons, AI systems capable of generating synthetic audio, image, video or text content and AI systems used for emotion recognition and biometric categorisation.
EU AI Act, Article 50.1.
EU AI Act, Article 50.5.
EU AI Act, Article 50.6.