Right to Explanation
This topic addresses the right of individuals to receive meaningful explanations of how automated decisions affecting them are made, a critical transparency and accountability mechanism under both the GDPR and the AI Act.
Overview
Legal Framework
The "right to explanation" is primarily governed by Article 22 of the General Data Protection Regulation (GDPR), which regulates decisions based solely on automated processing, including profiling, that produce legal or similarly significant effects for an individual. This provision establishes a general prohibition on such processing, subject to specific exceptions. Critically, where an exception applies and automated decision-making takes place, Article 22(3) mandates that the controller implement suitable safeguards, including at least the right to obtain human intervention, to express one's point of view, and to contest the decision. The right to meaningful information about the logic involved, part of the broader transparency obligations under Articles 13-15 GDPR, forms the foundation of the explanation itself.
Practical Application
The right to explanation is not a standalone right but is derived from the transparency provisions (Articles 13(2)(f), 14(2)(g), and 15(1)(h) GDPR) in the context of Article 22. The explanation must be sufficiently comprehensive to enable the data subject to understand the rationale for the decision and to effectively exercise their rights to human intervention and contestation. This means explanations must be meaningful, concise, and intelligible, avoiding overly technical jargon. While the GDPR does not mandate disclosure of complex algorithms or source code, the explanation should cover the key factors and criteria that led to the decision. The recitals of the AI Act reinforce this by linking transparency to the principle of human agency and oversight, emphasizing that individuals should be informed when they are subject to an AI system, particularly in high-risk contexts.
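The requirement to explain "the key factors and criteria that led to the decision" in plain language can be illustrated with a minimal sketch. The function, factor names, and weights below are hypothetical and purely illustrative; they do not come from any real scoring system, and a real controller would derive the contributions from its own model.

```python
# Hypothetical sketch: turning model feature contributions into a
# plain-language summary of the "logic involved" (Art. 15(1)(h) GDPR).
# Factor names and weights are illustrative, not from any real system.

def explain_decision(outcome: str, contributions: dict[str, float], top_n: int = 3) -> str:
    """Summarise the most influential factors behind an automated decision."""
    # Rank factors by the magnitude of their influence, regardless of sign.
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    lines = [f"Decision: {outcome}", "Main factors that influenced this decision:"]
    for name, weight in ranked[:top_n]:
        direction = "counted in your favour" if weight > 0 else "counted against you"
        lines.append(f"- {name} ({direction})")
    # Point the data subject to the Article 22(3) safeguards.
    lines.append("You may request human review or contest this decision.")
    return "\n".join(lines)

explanation = explain_decision(
    "Credit application declined",
    {"payment history": -0.42, "income stability": 0.18, "existing debt": -0.35},
)
print(explanation)
```

The point of the sketch is that the output names concrete factors and the avenues of redress, without disclosing the algorithm or source code, which the GDPR does not require.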
Key Considerations
- Integrate with Article 22 Safeguards: An explanation is a core component of the "suitable safeguards" required by Article 22(3) GDPR. It must be provided proactively (under Articles 13/14) when such processing is envisaged, and upon request (under Article 15).
- Focus on Understandability, Not Complexity: The obligation is to explain the "logic involved," not the algorithm itself. The explanation should address the significance of the processed data, the reasons for the specific outcome, and the general system functionality in an accessible manner.
- Distinguish Between Prohibition and Condition: Organizations must first determine if their processing falls under the Article 22 prohibition. Only if a specific exception applies (e.g., explicit consent, contract) does the obligation to provide explanations and other safeguards as a condition for lawful processing arise.
Laws (8)
Recital 10
Recital 27
Recital 93
Recital 59
Recital 107
Article 86
Right to explanation of individual decision-making
Article 22
Automated individual decision-making, including profiling
Article 40
Exceptions to the prohibition on automated individual decision-making
Case Law (1)
Guidance (23)
Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications
Guidelines on processing of personal data through video devices
Guidelines 06/2020 on the interplay of the Second Payment Services Directive and the GDPR
Guidelines on the Interplay between the application of Article 3 and the provisions on international transfers as per Chapter V of the GDPR
Guidelines 03/2022 on Deceptive design patterns in social media platform interfaces: how to recognise and avoid them
Guidelines on deceptive design patterns in social media platform interfaces: how to recognise and avoid them
These Guidelines offer practical recommendations to social media providers as controllers of social media, designers and users of social media platforms on how to assess and avoid so-called 'deceptive design patterns' in social media interfaces that infringe on GDPR requirements. To this end, the EDPB recommends that controllers make use of interdisciplinary teams, consisting, among others, of designers, data protection officers and decision-makers. It is important to note ...
Guidelines 01/2022 on data subject rights - Right of access
Guidelines on data subject rights - Right of access
The right of access of data subjects is enshrined in Art. 8 of the EU Charter of Fundamental Rights. It has been a part of the European data protection legal framework since its beginning and is now further developed by more specified and precise rules in Art. 15 GDPR.
Guidelines 4/2019 on Article 25 Data Protection by Design and by Default Version 2.0 Adopted on 20 October 2020
Guidelines on data protection by design and by default
Guidelines 10/2020 on restrictions under Article 23 GDPR
Guidelines on restrictions under Article 23 GDPR
Guidelines 8/2020 on the targeting of social media users
Guidelines on the targeting of social media users
Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement
Guidelines on the use of facial recognition technology in the area of law enforcement
More and more law enforcement authorities (LEAs) apply or intend to apply facial recognition technology (FRT). It may be used to authenticate or to identify a person and can be applied to videos (e.g. CCTV) or photographs. It may be used for various purposes, including to search for persons in police watch lists or to monitor a person's movements in the public space. FRT is built on the processing of biometric data; therefore, it encompasses the processing of special categories ...
Article 29 Data Protection Working Party
Guidelines on transparency
Guidelines on transfers of personal data between public authorities and bodies within and outside the EEA
Guidelines 05/2020 on consent under Regulation 2016/679
News (3)
OLG Bamberg - 10 U 61/25 e
The court held that the mere automated creation of a score value by a credit information agency does not trigger Article 22 GDPR unless it directly leads to a legally or similarly significant decision concerning the data subject.
Is the AI Act caging ChatGPT and other General Purpose Artificial Intelligence systems?
> The growth of generative artificial intelligence systems has led EU lawmakers to focus on General Purpose AI in drafting the AI Act, which will set the framework governing artificial intelligence in the European Union. As previously reported, the EU Parliament has already broadened the definition of artificial intelligence for the purposes of the AI Act…
Quod erat demonstrandum? - Towards a typology of the concept of explanation for the design of explainable AI
> * We propose a framework for defining different types of explanations of AI systems. > * We contextualize current XAI discourses within the proposed framework. > * We highlight two broad perspectives for defining quality criteria for explainability. > * We discuss the relevance of our framework in light of current and upcoming AI regulation.