Digital Services Act

What is the Digital Services Act?

On 23 February 2024 the European Commission published an explanation of the Digital Services Act on its website. Our editorial team believes it is important for our readers to know the EU's own definition, so we are republishing it here.

The Digital Services Act (DSA) regulates the obligations of digital services, including marketplaces, that act as intermediaries in their role of connecting consumers with goods, services, and content.

It better protects users by safeguarding fundamental rights online, establishing a powerful transparency and accountability framework for online platforms and providing a single, uniform framework across the EU.

The European Parliament and Council reached a political agreement on the new rules on 23 April 2022 and the DSA entered into force on 16 November 2022 after being published in the EU Official Journal on 27 October 2022.

The Digital Services Act is a Regulation that is directly applicable across the EU. Some of the obligations for intermediaries include:

  • Measures to counter illegal content online, including illegal goods and services. The DSA imposes new mechanisms allowing users to flag illegal content online, and for platforms to cooperate with specialised ‘trusted flaggers’ to identify and remove illegal content;
  • New rules to trace sellers on online marketplaces, to help build trust and go after scammers more easily; a new obligation for online marketplaces to randomly check against existing databases whether products or services on their sites are compliant; sustained efforts to enhance the traceability of products through advanced technological solutions;
  • Effective safeguards for users, including the possibility to challenge platforms’ content moderation decisions based on the obligatory information platforms must now provide to users when their content gets removed or restricted;
  • Wide-ranging transparency measures for online platforms, including better information on terms and conditions, as well as transparency on the algorithms used for recommending content or products to users;
  • New obligations for the protection of minors on any platform in the EU;
  • Obligations for very large online platforms and search engines to prevent abuse of their systems by taking risk-based action, subject to oversight through independent audits of their risk management measures. Platforms must mitigate risks such as disinformation or election manipulation, cyber violence against women, or harm to minors online, while carefully balancing these measures against restrictions on freedom of expression;
  • A new crisis response mechanism for serious threats to public health and security, such as a pandemic or a war;
  • Bans on targeted advertising on online platforms when it relies on profiling of children or on special categories of personal data such as ethnicity, political views or sexual orientation;
  • Enhanced transparency for all advertising on online platforms and influencers’ commercial communications;
  • A ban on using so-called ‘dark patterns’ on the interface of online platforms, referring to misleading tricks that manipulate users into choices they do not intend to make;
  • New provisions to allow access to data to researchers of key platforms, in order to scrutinise how platforms work and how online risks evolve;
  • New rights for users, including the right to complain to the platform, seek out-of-court settlement, complain to their national authority in their own language, or seek compensation for breaches of the rules. Representative organisations are now able to defend user rights in cases of large-scale breaches of the law;
  • A unique oversight structure. The Commission is the primary regulator for very large online platforms and very large online search engines (reaching at least 45 million users), while other platforms and search engines will be under the supervision of the Member States where they are established. The Commission will have enforcement powers similar to those it has under antitrust proceedings. An EU-wide cooperation mechanism is currently being established between national regulators and the Commission;
  • The liability rules for intermediaries have been reconfirmed and updated by the co-legislator, including a Europe-wide prohibition of generalised monitoring obligations.


Does the Digital Services Act define what is illegal online?

No. The DSA sets out EU-wide rules covering the detection, flagging and removal of illegal content, as well as a new risk assessment framework for very large online platforms and search engines addressing how illegal content spreads on their services.

What constitutes illegal content is defined in other laws, either at EU level or at national level – for example, terrorist content, child sexual abuse material and illegal hate speech are defined at EU level. Where content is illegal only in a given Member State, as a general rule it should only be removed in the territory where it is illegal.

What rules preceded the Digital Services Act, and why did they have to be updated?

The e-Commerce Directive, adopted in 2000, has been the main legal framework for the provision of digital services in the EU. It is a horizontal legal framework that has been the cornerstone for regulating digital services in the European single market.

Much has changed in more than 20 years, and the rules needed to be upgraded. Online platforms have created significant benefits for consumers and innovation, have facilitated cross-border trading within and outside the Union, and have opened new opportunities to a variety of European businesses and traders. At the same time, they are abused for disseminating illegal content or selling illegal goods or services online. Some very large players have emerged as quasi-public spaces for information sharing and online trade. They pose particular risks for users' rights, information flows and public participation. In addition, the e-Commerce Directive did not specify any cooperation mechanism between authorities. The "Country of Origin" principle meant that supervision was entrusted to the country of establishment.

The Digital Services Act builds on the rules of the e-Commerce Directive, and addresses the particular issues emerging around online intermediaries. Member States have regulated these services differently, creating barriers for smaller companies looking to expand and scale up across the EU and resulting in different levels of protection for European citizens.

With the Digital Services Act, unnecessary legal burdens due to different laws were lifted, fostering a better environment for innovation, growth and competitiveness, and facilitating the scaling up of smaller platforms, SMEs and start-ups. At the same time, it equally protects all users in the EU, both as regards their safety from illegal goods, content or services, and as regards their fundamental rights.

What is the relevance of the regulation of intermediaries at the global level?

The DSA is an important step in defending European values in the online space. It respects international human rights norms, and helps better protect democracy, equality and the rule of law.

The DSA sets high standards for effective intervention, for due process and the protection of fundamental rights online; it preserves a balanced approach to the liability of intermediaries, and establishes effective measures for tackling illegal content and societal risks online. In doing so, the DSA aims at setting a benchmark for a regulatory approach to online intermediaries also at the global level.

Do these rules apply to companies outside of the EU?

They apply in the EU single market, without discrimination, including to those online intermediaries established outside of the European Union that offer their services in the single market. When not established in the EU, they have to appoint a legal representative, as many companies already do as part of their obligations in other legal instruments. At the same time, online intermediaries also benefit from the legal clarity of the liability exemptions and from a single set of rules when providing their services in the EU.