Intermediary Liability Framework under DSA
This topic covers the broader intermediary liability framework under the DSA, of which mere conduit is one component, including the conditions, standards, and exemptions that apply to different types of digital services.
Overview
Legal Framework
The intermediary liability framework under the Digital Services Act (DSA) is primarily governed by its safe harbor provisions for mere conduit (Article 4), caching (Article 5), and hosting (Article 6) services. These articles establish conditional exemptions from liability for illegal content transmitted or stored by these intermediary services. The conditions for these exemptions are detailed in accompanying recitals. Recital 22 DSA clarifies that to benefit from the hosting exemption, a provider must act expeditiously to remove or disable access to illegal content upon obtaining actual knowledge or awareness of it, while observing fundamental rights like freedom of expression. Recital 53 DSA further specifies that notice-and-action mechanisms must allow for the submission of notices that are sufficiently precise and substantiated to enable an informed and diligent decision by the hosting service provider regarding the illegality of the content.
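To make the tiered structure of Articles 4 to 6 easier to scan, the sketch below (Python, illustrative only) restates the headline conditions for each exemption as a simple checklist. The condition labels are paraphrases written for this example, not statutory text, and the function exemption_available is purely hypothetical.

```python
from enum import Enum

class ServiceType(Enum):
    MERE_CONDUIT = "mere conduit (Art. 4 DSA)"
    CACHING = "caching (Art. 5 DSA)"
    HOSTING = "hosting (Art. 6 DSA)"

# Paraphrased headline conditions for each liability exemption.
# Simplified labels for an internal checklist, not statutory wording.
EXEMPTION_CONDITIONS = {
    ServiceType.MERE_CONDUIT: [
        "does not initiate the transmission",
        "does not select the receiver of the transmission",
        "does not select or modify the transmitted information",
    ],
    ServiceType.CACHING: [
        "does not modify the information",
        "complies with conditions on access to the information",
        "complies with widely recognised industry rules on updating the information",
        "acts expeditiously to remove or disable access once aware that the "
        "information at the initial source was removed or ordered removed",
    ],
    ServiceType.HOSTING: [
        "has no actual knowledge of illegal activity or illegal content and is "
        "not aware of facts from which the illegality is apparent",
        "acts expeditiously to remove or disable access upon obtaining such "
        "knowledge or awareness",
    ],
}

def exemption_available(service: ServiceType, conditions_met: set[str]) -> bool:
    """Return True only if every headline condition for the service type is met."""
    return all(condition in conditions_met for condition in EXEMPTION_CONDITIONS[service])
```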
Practical Application
The framework creates a tiered system where liability protection is conditional on passive or reactive behavior, depending on the service type. For hosting services, the key interpretation from the recitals is that "actual knowledge" triggers a duty to act expeditiously. The standard for notices under Recital 53 is critical; a notice must contain specific information (e.g., URL, reason for illegality) to form a valid basis for action. Providers are not obligated to monitor generally or proactively seek facts indicating illegal activity, but they must establish functionally effective and user-friendly mechanisms to receive and process such notices. The requirement to observe fundamental rights means removal decisions must be proportionate and consider freedom of expression, potentially requiring human review or appeal mechanisms in borderline cases.
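As a rough illustration of how the "sufficiently precise and substantiated" notice standard could be translated into an intake check, the following Python sketch uses a hypothetical notice schema; the field names (content_url, illegality_explanation, good_faith_statement, and so on) are assumptions made for this example, loosely modelled on the notice elements hosting services must be able to receive, not a prescribed format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Notice:
    """Hypothetical intake record for a notice of allegedly illegal content."""
    content_url: str                 # exact electronic location of the content
    illegality_explanation: str      # why the notifier considers the content illegal
    notifier_name: Optional[str]     # may be omitted for certain offence types
    notifier_email: Optional[str]
    good_faith_statement: bool       # notifier confirms the notice is accurate and in good faith

def is_actionable(notice: Notice) -> bool:
    """Rough check that a notice is precise and substantiated enough to assess.

    A notice failing this check would typically be returned to the notifier
    for completion rather than treated as conferring actual knowledge.
    """
    has_location = notice.content_url.startswith(("http://", "https://"))
    has_reasoning = len(notice.illegality_explanation.strip()) > 0
    return has_location and has_reasoning and notice.good_faith_statement
```

The design choice sketched here, filtering incomplete notices at intake, reflects the idea that only a notice meeting the Recital 53 standard forms a valid basis for action; the legal assessment of the content itself remains case-specific and cannot be reduced to field validation.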
Key Considerations
- Implement Robust Notice-and-Action Procedures: Hosting service providers must establish clear, accessible channels for reporting illegal content and implement internal processes to assess notices against the "sufficiently precise and substantiated" standard of Recital 53 before taking action.
- Document 'Expeditious Action': Upon validating a notice that gives actual knowledge, document the timing and rationale for the removal or disabling action to demonstrate compliance with the "act expeditiously" requirement from Recital 22 and to maintain a defense against liability claims (see the sketch after this list).
- Balance Removal with Rights Preservation: Design content moderation decisions, especially automated ones, with safeguards to avoid over-removal. Provide clear reasoning for actions and accessible appeal procedures to uphold user rights to freedom of expression and information.
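To support the documentation point in the second consideration above, one possible approach is an append-only decision log. The sketch below is a minimal illustration with hypothetical field names (RemovalDecisionRecord, notice_id, and so on); the DSA does not prescribe a particular record format, so what a provider captures is an assumption made for this example.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class RemovalDecisionRecord:
    """Illustrative audit entry for a removal or disabling decision."""
    notice_id: str
    received_at: datetime          # when the valid notice was received
    decided_at: datetime           # when the removal/disabling decision was executed
    action_taken: str              # e.g. "removed", "disabled", "no action"
    rationale: str                 # why the content was (or was not) actioned
    reviewed_by_human: bool        # relevant for borderline freedom-of-expression cases
    appeal_reference: Optional[str] = None

    @property
    def hours_to_action(self) -> float:
        """Elapsed time between notice receipt and decision, in hours."""
        return (self.decided_at - self.received_at).total_seconds() / 3600

# Example: a record created when content is disabled after human review.
record = RemovalDecisionRecord(
    notice_id="N-2024-0001",
    received_at=datetime(2024, 3, 1, 9, 0, tzinfo=timezone.utc),
    decided_at=datetime(2024, 3, 1, 15, 30, tzinfo=timezone.utc),
    action_taken="disabled",
    rationale="Notice identified the exact URL and cited the applicable national law.",
    reviewed_by_human=True,
)
```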
Case Law (2)
Data Protection Commissioner v. Schrems and Facebook
Schrems I
Safe harbour: US public authorities are not required to comply with safe harbour principles. Decision 2000/520 specifies that safe harbour principles may be limited to the extent necessary to meet national security, public interest or law enforcement requirements, or statute, regulation or case law. Self-certified US organizations receiving personal data from the EU are thus bound to disregard safe harbour principles when they conflict with US legal requirements. Decision 2000/520 does not contain […]
Data Protection Commissioner v. Schrems and Facebook
Schrems I
Necessity/proportionality: The Decision does not contain any finding regarding US rules intended to limit the interference when they pursue legitimate objectives such as national security, nor does it refer to effective legal protection against such interference. FTC procedures and private dispute resolution mechanisms concern compliance with safe harbour principles (against US organizations) and cannot be applied with respect to measures originating from the State. Moreover, the Commission found that […]
News (3)
Danish SA Declares Use of Google Analytics Unlawful Without Supplementary Measures
The Danish Data Protection Agency has looked into the tool Google Analytics and its settings, and the terms under which the tool is provided. On the basis of this review, the Danish Data Protection Agency concludes that the tool cannot, without more, be used lawfully. Lawful use requires the implementation of supplementary measures in addition to the settings provided by Google.
Irish Data Protection Commissioner Fines Instagram EUR 405M for Children's Privacy Violations
> The fine is the result of an investigation that began in 2020 and focused on the company’s processing of children’s personal data. Based on press reports, the investigation focused on children between the ages of 13 and 17 who were allowed to operate business or creator Instagram accounts. As a result, children’s phone numbers and email addresses were publicly accessible.
CNIL Proposes 60 Million Euros Fine Against French AdTech Company For Non-Compliance with GDPR
> The proposed fine follows complaints filed by privacy NGO ‘Privacy International’ against Criteo. […] Under the CNIL’s sanction procedure, Criteo has the right to respond to the report, both with respect to the alleged infringements and the proposed sanction.