EU Digital Services Act (DSA)

Overview

The EU Digital Services Act (DSA), or Regulation (EU) 2022/2065, establishes a harmonized legal framework for digital services across the European Union. Adopted in October 2022, it replaces the intermediary liability rules of the eCommerce Directive (2000/31/EC) and aims to create a safer, more transparent online environment. The DSA applies to providers of intermediary services, including hosting services, online platforms, and very large online platforms and search engines. It introduces due diligence obligations, clearer responsibilities for illegal content, and stronger user rights.

Regulation Summary

Timeline

  • December 15, 2020 – European Commission proposes the Digital Services Act.
  • October 19, 2022 – Regulation (EU) 2022/2065 adopted.
  • November 16, 2022 – Enters into force.
  • February 17, 2024 – Becomes fully applicable across the EU.

Who It Applies To

  • Hosting services and intermediary platforms (e.g., web hosting, cloud storage, content-sharing platforms).
  • Online platforms connecting sellers with consumers.
  • Very large online platforms and search engines (more than 45 million monthly active users in the EU, roughly 10% of the Union's population).
  • Providers established outside the EU that target users in the EU.

Exemptions

  • Micro and small enterprises are exempt from some obligations (e.g., transparency reports) unless designated as very large platforms.
  • Private communications (e.g., emails, direct messages) fall outside the scope.

Provider Obligations

  • Act on notices of illegal content through a structured Notice and Action mechanism (see the sketch after this list).
  • Provide terms and conditions in plain language.
  • Ensure transparency in content moderation decisions.
  • Designate a legal representative in the EU (if not established there).
  • Cooperate with Digital Services Coordinators and law enforcement.
  • Due diligence obligations scale with provider type: all intermediaries have baseline duties; hosting services and platforms have additional responsibilities; very large platforms face enhanced requirements.
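
As a rough illustration of the Notice and Action mechanism, the sketch below models the minimum information Article 16 requires in a notice. The field names and the validateNotice helper are hypothetical; the DSA prescribes what a notice must contain, not a wire format.

```typescript
// Hypothetical shape of an Article 16 notice of illegal content.
interface IllegalContentNotice {
  explanation: string;         // why the notifier considers the content illegal
  contentLocation: string;     // exact electronic location, e.g. the URL(s)
  notifierName?: string;       // identity may be omitted for certain offences
  notifierEmail?: string;
  goodFaithStatement: boolean; // confirmation the notice is accurate and complete
}

// Illustrative check that a notice carries the required elements before the
// provider confirms receipt and processes it.
function validateNotice(n: IllegalContentNotice): string[] {
  const problems: string[] = [];
  if (!n.explanation.trim()) problems.push("missing explanation of alleged illegality");
  if (!n.contentLocation.trim()) problems.push("missing exact location (URL) of the content");
  if (!n.goodFaithStatement) problems.push("missing good-faith accuracy statement");
  return problems;
}
```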

Transparency Toward Users

  • Implement mechanisms for users to report illegal content.
  • Inform users about content removal decisions and provide redress options (a statement-of-reasons sketch follows this list).
  • Disclose information on advertising practices (e.g., targeting criteria).
  • Avoid misleading or manipulative interface designs ("dark patterns"), as prohibited by Article 25.
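
For the removal-decision duty, Article 17 lists what a "statement of reasons" must tell the affected user. The type below is a hypothetical shape for that information, not an official format.

```typescript
// Hypothetical sketch of an Article 17 statement of reasons.
type Restriction =
  | "removal"
  | "visibility_restriction"
  | "monetisation_restriction"
  | "account_suspension"
  | "account_termination";

interface StatementOfReasons {
  restriction: Restriction;      // which decision was taken
  factsAndCircumstances: string; // what the decision was based on
  automatedMeansUsed: boolean;   // whether automated detection or decision-making was involved
  legalGround?: string;          // the law allegedly infringed, for illegal content
  contractualGround?: string;    // the terms-and-conditions clause relied on, otherwise
  redressOptions: string[];      // e.g. internal complaint, out-of-court dispute settlement
}
```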

Platform-Specific Requirements

  • Online platforms must enable traceability of business users.
  • Very large platforms must assess and mitigate systemic risks (e.g., disinformation, harm to minors).
  • Recommender systems on very large platforms must offer at least one option not based on profiling (see the sketch after this list).
  • Annual independent audits for very large platforms.
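
A minimal sketch of what the recommender duty could look like in practice, assuming a feed that falls back to reverse-chronological ordering when the user opts out of profiling; all names here are illustrative.

```typescript
// Illustrative only: a feed with a user-selectable non-profiling option.
interface FeedItem {
  id: string;
  postedAt: Date;
}

function rankFeed(items: FeedItem[], useProfiling: boolean): FeedItem[] {
  if (!useProfiling) {
    // Non-profiling option: plain reverse-chronological ordering.
    return [...items].sort((a, b) => b.postedAt.getTime() - a.postedAt.getTime());
  }
  return personalisedRanking(items);
}

// Stub standing in for a profiling-based ranker (engagement signals, etc.).
function personalisedRanking(items: FeedItem[]): FeedItem[] {
  return items;
}
```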

User Rights

  • Right to transparency in content moderation and advertising.
  • Right to contest content removal and algorithmic decisions.
  • Right to lodge complaints with the Digital Services Coordinator.
  • Protection from advertising based on profiling of minors or on sensitive personal data.

Enforcement & Penalties

  • Regulatory authorities: national Digital Services Coordinators and the European Commission (for very large platforms).
  • Penalties (a worked example follows this list):
    • Fines up to 6% of global annual turnover for serious non-compliance.
    • Fines up to 1% for providing incorrect, incomplete, or misleading information.
    • Periodic penalty payments up to 5% of average daily worldwide turnover for ongoing violations.
    • Temporary restriction or suspension of the service as a last resort in serious cases.
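
To make the ceilings concrete, here is a small worked example using a hypothetical €10 billion annual worldwide turnover; the figures are illustrative, not a legal calculation.

```typescript
// Worked example of the DSA fine ceilings (amounts in EUR).
const annualWorldwideTurnover = 10_000_000_000; // hypothetical €10bn

const maxFineSeriousBreach = 0.06 * annualWorldwideTurnover;    // €600,000,000
const maxFineMisleadingInfo = 0.01 * annualWorldwideTurnover;   // €100,000,000
const maxDailyPenalty = 0.05 * (annualWorldwideTurnover / 365); // ≈ €1,370,000 per day

console.log({ maxFineSeriousBreach, maxFineMisleadingInfo, maxDailyPenalty });
```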