Provider Obligations for AI Systems

This topic addresses the obligations imposed on providers of high-risk AI systems, a distinct category of requirements spanning pre-market compliance and post-market duties.

Overview

Legal Framework

Articles 16 through 29 of the EU AI Act establish the core obligations for providers of high-risk AI systems. Providers must implement a comprehensive compliance framework before placing such systems on the market or putting them into service. Key mandates include establishing a risk-based quality management system (Article 17), drawing up detailed technical documentation (Article 18), conducting the applicable conformity assessment procedure (Article 19), and implementing a post-market monitoring system to collect data on the system's performance throughout its lifecycle (Article 61). Providers must also affix the CE marking and register the system in the EU database.

Practical Application

These obligations are cumulative and must be integrated into the provider's development and business processes. As indicated in Recital 81, providers already subject to sector-specific quality management obligations (e.g., under medical device or machinery regulations) must integrate the AI Act's requirements into their existing systems. In practice, compliance centers on demonstrable, documented processes: the technical documentation must be detailed enough to support conformity assessment and market surveillance. The post-market monitoring system is not merely a feedback channel but an active vigilance mechanism designed to identify and mitigate emerging risks, requiring a structured plan for data collection, analysis, and reporting. While definitive case law is absent, regulatory guidance indicates that providers are expected to take a proportionate, risk-based approach, applying more rigorous processes to systems presenting greater potential harm.

Key Considerations

  • Integrate, Don't Duplicate: For providers in regulated sectors, the quality management system for AI should be an integrated component of the existing, sector-specific management system, not a separate, parallel structure.
  • Documentation is Foundational: The technical documentation and the declaration of conformity are not just compliance checkboxes but are critical legal documents that must be maintained, kept up-to-date, and made available to authorities upon request. Their adequacy will be a primary focus in any conformity assessment or enforcement action.
