Digital Services Act: agreement for a transparent and safe online environment


Parliament and Council reached a provisional political agreement on the Digital Services Act (DSA). Together with the Digital Markets Act, the DSA will set the standards for a safer and more open digital space for users and a level playing field for companies for years to come.

EU negotiators agree on landmark rules to effectively tackle the spread of illegal content online and protect people's fundamental rights in the digital sphere.

> Access to platforms’ algorithms now possible
> Online platforms will have to remove illegal products, services or content swiftly after they have been reported
> Protection of minors online reinforced; additional bans on targeted advertising for minors as well as targeting based on sensitive data
> Users will be better informed about how content is recommended to them

“The Digital Services Act will set new global standards. Citizens will have better control over how their data are used by online platforms and big tech companies. We have finally made sure that what is illegal offline is also illegal online. For the European Parliament, additional obligations on algorithmic transparency and disinformation are important achievements,” said rapporteur Christel Schaldemose (DK, S&D). “These new rules also guarantee more choice for users and new obligations for platforms on targeted ads, including bans on targeting minors and restrictions on data harvesting for profiling.”

The Commission made its proposal for the Digital Services Act on 15 December 2020, together with the proposal for the Digital Markets Act, on which the European Parliament and Council reached a political agreement on 22 March 2022. The political agreements on these two files will work together to ensure a safe, open and fair online environment in the EU.

> More info on the agreement: Digital Services Act: Commission welcomes political agreement on rules ensuring a safe and accountable online environment

> More info on the Digital Services Act, ensuring a safe and accountable online environment

A new framework for digital services

The new framework under the DSA is founded on European values, including respect for human rights, freedom, democracy, equality and the rule of law. It will rebalance the rights and responsibilities of users, online intermediaries, including online platforms as well as very large online platforms, and public authorities.

The DSA contains EU-wide due diligence obligations that will apply to all digital services that connect consumers to goods, services, or content, including new procedures for faster removal of illegal content as well as comprehensive protection for users' fundamental rights online.

The DSA covers various online intermediary services, whose obligations depend on their role, size, and impact on the online ecosystem. These intermediary services include:

  • Intermediary services offering network infrastructure: Internet access providers, domain name registrars;
  • Hosting services such as cloud computing and webhosting services;
  • Very large online search engines reaching more than 10% of the 450 million consumers in the EU, which therefore bear more responsibility for curbing illegal content online;
  • Online platforms bringing together sellers and consumers such as online marketplaces, app stores, collaborative economy platforms and social media platforms;
  • Very large online platforms, with a reach of more than 10% of the 450 million consumers in the EU, which could pose particular risks in the dissemination of illegal content and societal harms.

Concretely, the DSA contains:

  • Measures to counter illegal goods, services or content online, such as:
    • a mechanism for users to easily flag such content and for platforms to cooperate with so-called ‘trusted flaggers';
    • new obligations on traceability of business users in online marketplaces;
  • New measures to empower users and civil society, including:
    • the possibility to challenge platforms' content moderation decisions and seek redress, either via an out-of-court dispute mechanism or judicial redress;
    • provision of access for vetted researchers to the key data of the largest platforms, and access for NGOs to public data, to provide more insight into how online risks evolve;
    • transparency measures for online platforms on a variety of issues, including on the algorithms used for recommending content or products to users;
  • Measures to assess and mitigate risks, such as:
    • obligations for very large platforms and very large online search engines to take risk-based action to prevent the misuse of their systems and undergo independent audits of their risk management systems;
    • mechanisms to adapt swiftly and efficiently in reaction to crises affecting public security or public health;
    • new safeguards for the protection of minors and limits on the use of sensitive personal data for targeted advertising.
  • Enhanced supervision and enforcement by the Commission when it comes to very large online platforms. The supervisory and enforcement framework also confirms the important role of the independent Digital Services Coordinators and of the Board for Digital Services.

More responsible online platforms

Under the new rules, intermediary services, namely online platforms - such as social media and marketplaces - will have to take measures to protect their users from illegal content, goods and services.

  • Algorithmic accountability: the European Commission as well as the member states will have access to the algorithms of very large online platforms;
  • Swift removal of illegal content online, including products, services: a clearer “notice and action” procedure where users will be empowered to report illegal content online and online platforms will have to act quickly;
  • Fundamental rights to be protected also online: stronger safeguards to ensure notices are processed in a non-arbitrary and non-discriminatory manner and with respect for fundamental rights, including the freedom of expression and data protection;
  • More responsible online marketplaces: they will have to ensure that consumers can purchase safe products or services online, by strengthening checks to prove that the information provided by traders is reliable (“Know Your Business Customer” principle) and by making efforts to prevent illegal content from appearing on their platforms, including through random checks;
  • Victims of cyber violence will be better protected, especially against the non-consensual sharing of intimate images (“revenge porn”), with immediate takedowns;
  • Penalties: online platforms and search engines can be fined up to 6% of their worldwide turnover. In the case of very large online platforms (with more than 45 million users), the EU Commission will have exclusive power to demand compliance;
  • Fewer burdens and more time to adapt for SMEs: a longer period to apply the new rules will support innovation in the digital economy. The Commission will closely follow the potential economic effects of the new obligations on small businesses.

Safer online space for users

  • New transparency obligations for platforms will allow users to be better informed about how content is recommended to them (recommender systems) and to choose at least one option not based on profiling;
  • Online advertising: users will have better control over how their personal data are used. Targeted advertising is banned when it comes to sensitive data (e.g. based on sexual orientation, religion, ethnicity);
  • Protection of minors: platforms accessible to minors will have to take specific measures to protect them, including by fully banning targeted advertising;
  • Manipulating users’ choices through ‘dark patterns’ will be prohibited: online platforms and marketplaces should not nudge people into using their services, for example by giving more prominence to a particular choice or urging the recipient to change their choice via interfering pop-ups. Moreover, cancelling a subscription for a service should become as easy as subscribing to it;
  • Compensation: recipients of digital services will have a right to seek redress for any damages or loss suffered due to infringements by platforms.

Harmful content and disinformation

Very large online platforms will have to comply with stricter obligations under the DSA, proportionate to the significant societal risks they pose when disseminating illegal and harmful content, including disinformation.

  • Very large online platforms will have to assess and mitigate systemic risks and be subject to independent audits each year. In addition, those large platforms that use so-called “recommender systems” (algorithms that determine what users see) must provide at least one option that is not based on profiling;
  • Special measures in times of crisis: when a crisis occurs, such as a public security or health threat, the Commission may require very large platforms to limit any urgent threats on their platforms. These specific actions are limited to three months.

Next steps

The text will need to be finalised at technical level and verified by lawyer-linguists, before both Parliament and Council give their formal approval. Once this process is completed, it will come into force 20 days after its publication in the EU Official Journal and the rules will start to apply 15 months later.

From 23 to 27 May, a delegation from the EP’s Internal Market Committee will visit several company headquarters (Meta, Google, Apple and others) in Silicon Valley to discuss in person the Digital Services Act package, and other digital legislation in the pipeline, and hear the position of American companies, start-ups, academia and government officials.

Further information

Source: European Parliament, press room, 22 April 2022; European Commission, 23 April 2022
