Article 89
Monitoring actions
Under the AI Act, 'monitoring actions' form a distinct topic encompassing systematic oversight procedures, compliance verification, and market surveillance activities that are not fully covered by more general monitoring provisions.
The AI Act and Digital Services Act (DSA) establish distinct but related principles for monitoring. AI Act Recital 32 characterizes the use of real-time remote biometric identification by law enforcement as "particularly intrusive," creating a risk of a "feeling of constant surveillance." This implies a need for strict, purpose-bound monitoring of such high-risk AI systems to prevent fundamental rights violations. Conversely, DSA Recital 30 establishes a general prohibition against imposing de jure or de facto general monitoring obligations on providers of intermediary services. This prohibition is a cornerstone of the DSA's liability framework, though it does not preclude monitoring obligations in specific cases or pursuant to national authorities' orders.
These provisions create a bifurcated regime. For providers of high-risk AI systems under the AI Act, particularly in sensitive domains like law enforcement, the rationale of Recital 32 supports proactive, systematic monitoring for compliance with strict requirements and fundamental rights safeguards. For providers of intermediary services under the DSA, the default position from Recital 30 is a shield against obligations to generally monitor all user content to identify illegal material. Monitoring, if required, must be targeted and specific, such as acting on a valid court order or a specific notice about illegal content. The European Court of Justice's jurisprudence on the e-Commerce Directive's analogous provision (Article 15) informs this interpretation, consistently ruling against generalized monitoring obligations.
Guidelines on the use of facial recognition technology in the area of law enforcement
More and more law enforcement authorities (LEAs) apply, or intend to apply, facial recognition technology (FRT). It may be used to authenticate or to identify a person and can be applied to videos (e.g. CCTV footage) or photographs. It may serve various purposes, including searching for persons on police watch lists or monitoring a person's movements in public spaces. Because FRT is built on the processing of biometric data, it encompasses the processing of special categories ...
Guidelines on codes of conduct and monitoring bodies
Guidelines on processing of personal data through video devices
Guidelines on the territorial scope of the GDPR
A rundown of the fine on IAPP: https://iapp.org/news/a/a-rundown-of-the-greek-dpas-clearview-ai-fine-findings
> Do we need a Chief Privacy Officer, a Data Protection Officer, or do we need both? In the following article, I will examine the benefits of both roles, but I will also look at some of the challenges associated with each role and why these have prompted both Data Protection Officers and organisations to question what the ideal setup is for them.