What are conformity assessments under the EU AI Act?
What they involve and when providers need to do them
TL;DR
This newsletter is about conformity assessments under the EU AI Act. It looks at what these assessments are, who must carry them out, what they consist of and when they need to be carried out.
Here are the key takeaways:
A conformity assessment is a process for demonstrating that the requirements for high-risk AI systems have been fulfilled. This means showing compliance with the following:
Implementing a risk management system
Implementing data governance measures
Developing technical documentation
Ensuring transparency of the system and providing information to users
Ensuring human oversight of the system
Implementing measures for accuracy, robustness and cybersecurity
Providers of high-risk AI systems are responsible for carrying out conformity assessments. However, depending on the type of AI system, providers will need to carry out either the internal or external version of the conformity assessment:
The internal conformity assessment requires the provider itself to demonstrate compliance with the Act. The external conformity assessment requires a notified body to verify compliance with the Act.
There are two scenarios in which a conformity assessment must be carried out:
When the provider wants to place the AI system on the EU market, in which case the assessment must be completed beforehand
When the system is substantially modified after being placed on the market, in which case a new conformity assessment must be completed
What are conformity assessments?
A conformity assessment is a process for demonstrating that the requirements for high-risk AI systems have been fulfilled.1
What are high-risk AI systems? I briefly explain this in Who are providers under the AI Act?:
The bulk of the AI Act focuses on high-risk AI systems. The criteria for such systems under Article 6 are essentially two-tiered, whereby an AI system is high-risk if it is either:
Integrated into products regulated by specific sectoral product safety regulations as listed under Annex I
Used for performing certain activities in certain sectors listed under Annex III (such as biometric recognition systems, emotion recognition systems, etc.)
If an entity builds an AI system that meets either condition, then that entity will be classed as a provider of a high-risk AI system.
And what are the requirements for high-risk AI systems? These are detailed in Section 2 of Chapter III of the AI Act and include:
Implementing a risk management system
Implementing data governance measures
Developing technical documentation
Ensuring transparency of the system and providing information to users
Ensuring human oversight of the system
Implementing measures for accuracy, robustness and cybersecurity
The purpose of conformity assessments is to "ensure a high level of trustworthiness" of high-risk AI systems before they are placed on the EU market or put into service in the EU. This emulates a key feature of European product safety law, which uses conformity assessments as a means of ensuring that a product meets EU standards and can therefore be sold on the EU market. A completed assessment essentially signals to consumers, regulators and other stakeholders that the product is of sufficient quality and is safe to use.
Who is required to carry out a conformity assessment?
Providers of high-risk AI systems are responsible for carrying out conformity assessments. A 'provider' under the AI Act is an entity that develops an AI system. I explain this in more detail in Who are providers under the AI Act?.
However, the type of conformity assessment process that these providers carry out depends on the type of AI system they are providing. More specifically, for the purposes of conformity assessments, the AI Act distinguishes between three different types of providers:
Providers of AI systems in the area of biometrics (as detailed in Annex III)2
Providers of AI systems in the areas of critical infrastructure, education, HR, assessing creditworthiness, predicting criminality, border control and administration of justice (as detailed in Annex III)3
Providers of AI systems regulated by specific sectoral product safety regulations (as detailed in Annex I)4
What does a conformity assessment consist of?
The AI Act contains two types of conformity assessment processes:
The internal conformity assessment requires the provider to demonstrate that:
A quality management system has been established
The technical documentation for the system complies with the requirements of the Act
The design and development process of the AI system and its post-market monitoring are consistent with the technical documentation
The external conformity assessment covers similar ground to the internal version (compliance with the quality management system and technical documentation requirements), but it is more detailed and must be carried out by a notified body. Under the Act, a notified body is "a conformity assessment body notified in accordance with this Regulation and other relevant Union harmonisation legislation."5 Each EU Member State must establish a notified body within its jurisdiction and ensure that it has the necessary resources to carry out conformity assessments in accordance with the external procedure.6
Providers of AI systems in the area of biometrics have a choice between the internal and external conformity assessment.7 However, if the provider has not applied harmonised standards or common specifications to comply with the Act, then it must use the external conformity assessment.8
Providers of AI systems in the other areas listed in Annex III must follow the internal conformity assessment procedure.9 Providers of AI systems regulated by specific sectoral product safety regulations must follow the conformity assessment procedure under those regulations.10 However, the requirements for high-risk AI systems under the Act must be included in that assessment.
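Putting the routing rules above together, here is a minimal sketch in Python of how Article 43 assigns a procedure to a provider. The function and parameter names are illustrative rather than terms from the Act, and the sketch deliberately simplifies questions that would require legal analysis:

```python
from enum import Enum

class Procedure(Enum):
    INTERNAL = "internal control (provider self-assessment)"
    EXTERNAL = "assessment by a notified body"
    SECTORAL = "procedure under the relevant Annex I sectoral legislation"

def applicable_procedure(annex_i_product: bool,
                         biometrics: bool,
                         applied_harmonised_standards: bool,
                         prefers_external: bool = False) -> Procedure:
    """Illustrative sketch of the Article 43 routing rules; not legal advice."""
    if annex_i_product:
        # Annex I products follow their sectoral conformity assessment,
        # which must also cover the Act's high-risk requirements.
        return Procedure.SECTORAL
    if biometrics:
        # Biometrics providers may choose either procedure, but the external
        # one is mandatory if harmonised standards or common specifications
        # were not applied.
        if not applied_harmonised_standards:
            return Procedure.EXTERNAL
        return Procedure.EXTERNAL if prefers_external else Procedure.INTERNAL
    # All other Annex III high-risk systems: internal procedure.
    return Procedure.INTERNAL
```

Under this sketch, a biometrics provider that has applied harmonised standards may opt for the internal procedure, whereas one that has not must go to a notified body.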
When do conformity assessments need to be carried out?
There are two scenarios in which a conformity assessment must be carried out:
When the provider wants to place the AI system on the EU market, in which case the assessment must be completed beforehand
When the system is substantially modified after being placed on the market, in which case a new conformity assessment must be completed
Under the Act,11 a 'substantial modification' is a change made to the AI system that was not foreseen or planned in the initial conformity assessment and that either:
Affects compliance with the high-risk requirements under the Act
Results in a modification to the intended purpose
However, there is an exception: if the AI system continues to learn after deployment, a resulting change that was pre-determined by the provider and is part of the technical documentation for the system does not count as a substantial modification.12
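To make the test concrete, here is a minimal sketch in Python of the decision just described. The function name and boolean parameters are illustrative, not terms from the Act, and this is a simplification of a legal judgment, not a substitute for one:

```python
def requires_new_assessment(foreseen_in_initial_assessment: bool,
                            affects_highrisk_compliance: bool,
                            modifies_intended_purpose: bool,
                            predetermined_learning_change: bool = False) -> bool:
    """Illustrative sketch of the 'substantial modification' test; not legal advice."""
    # Exception: changes from continued learning that were pre-determined by
    # the provider and documented in the technical documentation are not
    # substantial modifications.
    if predetermined_learning_change:
        return False
    # Definition: a change not foreseen or planned in the initial conformity
    # assessment that affects compliance with the high-risk requirements or
    # modifies the intended purpose.
    return (not foreseen_in_initial_assessment) and (
        affects_highrisk_compliance or modifies_intended_purpose
    )
```

Under this sketch, a pre-planned learning update described in the technical documentation would not trigger a new assessment, while an unplanned change to the system's intended purpose would.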
1. EU AI Act, Article 3(20).
2. EU AI Act, Article 43(1).
3. EU AI Act, Article 43(2).
4. EU AI Act, Article 43(3).
5. EU AI Act, Article 3(22).
6. EU AI Act, Article 31.
7. EU AI Act, Article 43(1).
8. EU AI Act, Article 43(2).
9. EU AI Act, Article 43(2).
10. EU AI Act, Article 43(3).
11. EU AI Act, Article 3(23).
12. EU AI Act, Article 43(4).