Index
- General information on the Digital Services Act
- Impact of the Digital Services Act on users
- Impact of the Digital Services Act on businesses
- Impact of the Digital Services Act on Member States
1. General information on the Digital Services Act
What is the Digital Services Act?
The Digital Services Act (DSA) regulates the obligations of digital services that act as intermediaries in their role of connecting consumers with goods, services, and content.
It will give better protection to consumers and to fundamental rights online, establish a powerful transparency and accountability framework for online platforms and lead to fairer and more open digital markets.
Harmonised across the EU and directly applicable, the new rules will make it easier to provide digital innovations across borders, while ensuring the same level of protection to all citizens in the EU. Such obligations include:
- Measures to counter illegal content online, including illegal goods and services, such as a mechanism for users to flag such content, and for platforms to cooperate with “trusted flaggers”;
- New rules on the traceability of business users in online marketplaces, to help identify sellers of illegal goods;
- Effective safeguards for users, including the possibility to challenge platforms’ content moderation decisions;
- Wide-ranging transparency measures for online platforms, including on the algorithms used for recommendations;
- Obligations for very large online platforms to prevent abuse of their systems by taking risk-based action, including oversight through independent audits of their risk management measures;
- Access for researchers to the data of key platforms, in order to scrutinise how platforms work and how online risks evolve;
- An oversight structure to address the complexity of the online space: Member States will have the primary role, supported by a new European Board for Digital Services; for very large online platforms, enhanced supervision and enforcement by the Commission.
Does the Digital Services Act include provisions for digital taxation?
No. The Commission’s proposal for an interim digital tax on revenue from digital activities is a separate initiative from the Digital Services Act. The Digital Services Act contains no provisions in the field of taxation.
Does the Digital Services Act define what is illegal online?
No. The new rules will harmonise due diligence obligations for platforms and hosting services, and the conditions for liability exemptions for online intermediaries. They will not touch upon national or EU laws that specify what is illegal. At the same time, the proposal will help Member States enforce the law online, by establishing mechanisms for sending orders to service providers throughout the single market.
Will the Digital Services Act replace sector-specific legislation?
No. The Digital Services Act sets the horizontal rules covering all services and all types of illegal content, including goods or services. It does not replace or amend, but it complements sector-specific legislation such as the Audiovisual Media Services Directive (AVMSD), the Directive on Copyright in the Digital Single Market, the Consumer Protection Acquis, or the Proposal for a Regulation on preventing the dissemination of terrorist content online.
What are the current rules and why do they have to be updated?
The e-Commerce Directive, adopted in 2000, sets the main legal framework for the provision of digital services in the EU. It is a horizontal legal framework that has been the cornerstone for regulating digital services in the European single market.
Much has changed in 20 years and the rules need to be upgraded. Online platforms have created significant benefits for consumers and innovation, and have facilitated cross-border trading within and outside the Union and opened new opportunities to a variety of European businesses and traders. At the same time, they are abused for disseminating illegal content, or selling illegal goods or services online. Some very large players have emerged as quasi-public spaces for information sharing and online trade. They pose particular risks for users’ rights, information flows and public participation.
The Digital Services Act builds on the rules of the e-Commerce Directive, and addresses the particular issues emerging around online intermediaries. Member States have regulated these services differently, creating barriers for smaller companies looking to expand and scale up across the EU, and resulting in different levels of protection for European citizens.
With the Digital Services Act, unnecessary legal burdens will be lifted, fostering a better environment for innovation, growth and competitiveness, and facilitating the scaling up of smaller platforms, SMEs and start-ups. At the same time, it will equally protect all users in the EU, both as regards their safety from illegal goods, content or services, and as regards their fundamental rights.
What is the relevance of the regulation of intermediaries at global level?
The new rules are an important step in defending European values in the online space, including the respect of human rights, freedom, democracy, equality and the rule of law, and in reinforcing the competitiveness of digital services, consumer choice, and opportunities for online innovation.
The DSA proposal sets high standards for effective intervention, for due process and the protection of fundamental rights online; it preserves a balanced approach to the liability of intermediaries, and establishes effective measures for tackling illegal content and societal risks online. In doing so, the DSA aims at setting a benchmark for a regulatory approach to online intermediaries also at the global level.
Will these rules apply to companies outside of the EU?
They apply in the EU single market, without discrimination, including to those online intermediaries established outside of the European Union that offer their services in the single market. When not established in the EU, they will have to appoint a legal representative, as many companies already do as part of their obligations in other legal instruments. At the same time, online intermediaries will also benefit from the legal clarity of the liability exemptions and from a single set of rules when providing their services in the EU.
2. Impact on users
How will citizens benefit from the new rules?
Online platforms play an increasingly important role in the daily lives of Europeans. The rules will create a safer online experience for citizens to freely express their ideas, communicate and shop online by reducing their exposure to illegal activities and dangerous goods and ensuring the protection of fundamental rights.
Online platforms will need to identify their business users and clarify who is selling a product or offering a service; this will help track down rogue traders and will protect online shoppers against illegal products, such as counterfeit and dangerous goods. At the same time, citizens will be able to notify platforms of illegal content, including products, that they encounter, and to contest the decisions online platforms make when their content is removed: platforms are obliged to notify users of any decision taken and the reasons for it, and to provide a mechanism to contest the decision.
In addition, specific rules will be introduced for very large online platforms given their systemic impact in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas. When such platforms recommend content, users will be able to modify the criteria used, and choose not to receive personalised recommendations. Citizens will not have to take these companies at their word; citizens will be able to scrutinise their actions through the reports of independent auditors and vetted researchers.
What measures does the legislation propose to counter illegal content?
It will set out effective means for all actors in the online ecosystem to counter illegal content, but also illegal goods and services.
Users will be empowered to report illegal content in an easy and effective way. A privileged channel will be created for trusted flaggers – entities which have demonstrated particular expertise and competence – to report illegal content, to which platforms will have to react with priority. Where enabled by national laws, Member State authorities will be able to order any platform operating in the EU, irrespective of where it is established, to remove illegal content.
Finally, very large online platforms will need to take mitigating measures at the level of the overall organisation of their service to protect their users from illegal content, goods and services.
How will the DSA protect people from unsafe or counterfeit goods?
The Digital Services Act will set out effective means for all actors in the online ecosystem to counter illegal goods. Platforms will have to put mandatory procedures in place for removing illegal goods, and online marketplaces will be required to trace their traders (“know your business customer”). This will ensure a safe, transparent and trustworthy environment for consumers and discourage traders who abuse platforms to sell unsafe or counterfeit goods. Online platforms will further be required to organise their online interfaces in a way that allows traders to comply with their information obligations towards consumers.
A new system of trusted flaggers will also be available, e.g. for brand owners fighting counterfeit goods, for faster and easier flagging and removal of such goods, and public authorities will have new tools to order the removal of unsafe products directly. Very large online platforms will be subject to an audited risk assessment that includes an analysis of their vulnerability to illegal goods, and their mitigation measures at organisational level will be subject to annual audits, too.
How can harmful but not illegal content be effectively addressed?
To the extent that it is not illegal, harmful content should not be treated in the same way as illegal content. The new rules will only impose measures to remove or encourage removal of illegal content, in full respect of the freedom of expression.
The proposal focuses on addressing platforms’ vulnerability to manipulation that amplifies harmful behaviour, such as the spread of political disinformation, hoaxes and manipulation during pandemics, and harm to vulnerable groups. A supervised, risk-based approach will oblige very large platforms to assess and mitigate the risks their systems pose – including to fundamental rights, public interests, public health and security – and to subject their assessments and measures to independent audit.
In addition, the proposal sets out a co-regulatory framework where service providers can work under codes of conduct to address negative impacts regarding the viral spread of illegal content as well as manipulative and abusive activities, which are particularly harmful for vulnerable recipients of the service, such as minors.
Further Commission guidance and a revised Code of Practice have already been announced as regards, in particular, online disinformation, which can build on the rules to be agreed under the DSA.
How will you keep a fair balance with fundamental rights such as the freedom of expression?
The text puts protection of freedom of expression at its very core. This includes protection from government interference in people’s freedom of expression and information. The horizontal rules against illegal content are carefully calibrated and accompanied by robust safeguards for freedom of expression and an effective right of redress – to avoid both under-removal and over-removal of content on grounds of illegality.
The proposal gives users and consumers the possibility to contest the decisions taken by the online platforms to remove their content, including when these decisions are based on platforms’ terms and conditions. Users can complain directly to the platform, choose an out-of-court dispute settlement body or seek redress before courts.
The Digital Services Act proposes rules on transparency of content moderation decisions. For very large platforms, users and consumers will be able to have a better understanding of the ways these platforms impact our societies through audit reports and independent research.
How does the Digital Services Act fight disinformation?
Through the proposed rules on how platforms moderate content, on advertising, algorithmic processes and risk mitigation, it will aim to ensure that platforms – and in particular the very large ones – are more accountable and assume their responsibility for the actions they take and the systemic risks they pose, including on disinformation.
As a horizontal piece of legislation, the Digital Services Act will not explicitly address some of the very specific challenges related to disinformation, which will be further developed in the updated Code of Practice on Disinformation and the new Commission Guidance as announced in the European Democracy Action Plan.
How does the Digital Services Act regulate online advertising?
The Digital Services Act covers any type of advertising, from digital marketing to issues-based advertising and political ads and complements existing rules such as the General Data Protection Regulation, which already establishes, for example, rules on users’ consent or their right to object to targeted digital marketing.
These new rules will empower users in understanding and making informed decisions about the ads they see. They will have to be clearly informed whether and why they are targeted by each ad and who paid for the ad; they should also see very clearly when content is sponsored or organically posted on a platform. Notice and action obligations also apply for potentially illegal ads, as for any other type of content.
For very large online platforms, the societal stakes are higher, and the rules include additional measures to mitigate risks and enable oversight. They will have to maintain and provide access to ad repositories, allowing researchers, civil society and authorities to inspect how ads were displayed and how they were targeted. They will also need to assess whether and how their advertising systems are manipulated, and take measures to mitigate these risks.
The proposal is complemented by measures in the Digital Markets Act, in particular provisions for gatekeepers to increase transparency by providing information about the price of ads and the remuneration paid to the publisher. They also need to provide access to their performance measuring tools and the information necessary for advertisers and publishers to carry out their own independent verification of the ad inventory.
These measures will also be complemented by a forthcoming initiative on political advertising.
How does the Digital Services Act protect personal data?
The DSA has been designed in full compliance with existing rules on data protection, including the General Data Protection Regulation (GDPR) and the ePrivacy Directive, and does not modify the safeguards set out in these laws.
3. Impact on businesses
What digital services does the act cover?
The Digital Services Act applies to online intermediaries – services such as internet service providers, cloud services, messaging services, marketplaces and social networks – which transmit or store third-party content. Specific due diligence obligations apply to hosting services, and in particular to online platforms, a subcategory of hosting services. Examples of online platforms include social networks, content-sharing platforms, app stores, online marketplaces, and online travel and accommodation platforms. A subset of the rules in the Digital Services Act focuses on very large online platforms, which have a significant societal and economic impact, reaching at least 45 million users in the EU, representing 10% of the population.
What impact will the Digital Services Act have on businesses?
The DSA modernises and clarifies rules dating back to the year 2000. It will set a global benchmark, under which online businesses will benefit from a modern, clear and transparent framework assuring that rights are respected and obligations are enforced.
Moreover, for online intermediaries, and in particular for hosting services and online platforms, the new rules will cut the costs of complying with 27 different regimes in the single market. This will be particularly important for innovative SMEs, start-ups and scale-ups, which will be able to scale at home and compete with very large players.
Other businesses will also benefit from the new set of rules. They will have access to simple and effective tools for flagging illegal activities that damage their trade, as well as internal and external redress mechanisms, affording them better protections against erroneous removal, limiting losses for legitimate businesses and entrepreneurs.
Furthermore, providers which voluntarily take measures to further curb the dissemination of illegal content will be reassured that doing so will not cost them the protection of the legal liability exemptions.
What impact will the Digital Services Act have on start-ups and innovation in general?
It will make the single market easier to navigate, lower compliance costs and establish a level playing field. Fragmentation of the single market disproportionately disadvantages SMEs and start-ups wishing to grow, due to the absence of a large enough domestic market and the costs of complying with many different national laws. The costs of fragmentation are much easier to bear for businesses which are already large.
A common, horizontal, harmonised rulebook applicable throughout the Digital Single Market will give SMEs, smaller platforms and start-ups access to cross-border customers in their critical growth phase.
How will the proposed Digital Services Act differentiate between small and big players?
The proposal sets asymmetric due diligence obligations on different types of intermediaries depending on the nature of their services as well as on their size and impact, to ensure that their services are not misused for illegal activities and that providers operate responsibly. Certain substantive obligations are limited only to very large online platforms, which have a central role in facilitating the public debate and economic transactions. Very small platforms are exempt from the majority of obligations.
By rebalancing responsibilities in the online ecosystem according to the size of the players, the proposal ensures that the regulatory costs of these new rules are proportionate.
What impacts will the proposed Digital Services Act have on platforms and very large platforms?
All platforms, except the smallest, will be required to set up complaint and redress mechanisms and out-of-court dispute settlement mechanisms, cooperate with trusted flaggers, take measures against abusive notices, deal with complaints, vet the credentials of third-party suppliers, and provide user-facing transparency of online advertising.
In addition, very large online platforms – those reaching 45 million users or more, i.e. 10% of the European population – are subject to specific rules due to the particular risks they pose in the dissemination of illegal content and societal harms.
Very large online platforms will have to meet risk management obligations, undergo external risk audits and public accountability, provide transparency of their recommender systems and user choice for access to information, and share data with authorities and researchers.
What penalties will businesses face if they do not comply with the new rules?
The new enforcement mechanism, consisting of national and EU-level cooperation, will supervise how online intermediaries adapt their systems to the new requirements. Each Member State will need to appoint a Digital Services Coordinator, an independent authority responsible for supervising the intermediary services established in its Member State and/or for coordinating with specialist sectoral authorities. In doing so, it will be able to impose penalties, including financial fines. Each Member State will clearly specify the penalties in its national laws, in line with the requirements set out in the Regulation, ensuring they are proportionate to the nature and gravity of the infringement, yet dissuasive enough to ensure compliance.
In the case of very large platforms, the Commission will have direct supervision powers and can, in the most serious cases, impose fines of up to 6% of a service provider’s global turnover.
The enforcement mechanism is not only limited to fines: the Digital Services Coordinator and the Commission will have the power to require immediate actions where necessary to address very serious harms, and platforms may offer commitments on how they will remedy them.
For rogue platforms that refuse to comply with important obligations and thereby endanger people’s lives and safety, it will be possible, as a last resort, to ask a court for a temporary suspension of their service, after involving all relevant parties.
4. Impact on Member States
How can the gap between laws in Member States be filled?
The experience of recent years has shown that individual national action to rein in the spread of illegal content online, in particular where very large online platforms are involved, falls short of effectively addressing the challenges at hand and protecting all Europeans from online harm. Moreover, uncoordinated national action places additional hurdles on smaller online businesses and start-ups, which face significant costs to comply with diverging national legislation. Updated and harmonised rules will better protect and empower all Europeans, both individuals and businesses.
The Digital Services Act will propose one set of rules for the entire EU. All citizens in the EU will have the same rights, a common enforcement system will see them protected in the same way and the rules for online platforms will be the same across the whole Union. This means standardised procedures for notifying illegal content, the same access to complaints and redress mechanisms across the single market, the same standard of transparency of content moderation or advertising systems, and the same supervised risk mitigation strategy where very large online platforms are concerned.
Which institutions will supervise the rules, and who will select them?
Member States will be required to designate competent authorities – the Digital Services Coordinators – for supervising compliance of the services established on their territory with the new rules and to participate in the EU cooperation mechanism of the proposed Digital Services Act. The Digital Services Coordinator will be an independent authority with strong requirements to perform their tasks impartially and transparently. The new Digital Services Coordinator within each Member State will be an important regulatory hub, ensuring coherence and digital competence.
The Digital Services Coordinators will cooperate within an independent advisory group, the European Board for Digital Services, which can support with analysis, reports and recommendations, as well as coordinate the new tool of joint investigations by Digital Services Coordinators.
Furthermore, the proposed Digital Services Act establishes an innovative and effective mechanism for supervising very large online platforms, with a direct role for the European Commission and under the advice and cooperation of the Digital Services Coordinators, and the new European Board for Digital Services.
What will the Commission’s role be in the supervision of platforms?
The enforcement of the proposed Digital Services Act is primarily a task of national competent authorities, notably the Digital Services Coordinators. However, a Digital Services Coordinator or the Board may refer an unresolved problem to the Commission. The Commission will then step in and ask the Digital Services Coordinator of the service provider’s country of establishment to ensure compliance with the Digital Services Act, and with the related substantive requirements under national or Union law.
In addition, when it comes to very large online platforms, the new rules will provide for enhanced supervision and enforcement with the active participation of the Commission. In case of persistent infringements, the Commission – on recommendation by the Board, at the invitation of the competent Digital Services Coordinator or on its own initiative – may initiate proceedings against very large online platforms. This will ensure speedy intervention in EU-wide cases where very large online platforms raise systemic risks and ensure the level of assistance required to deal with the complex technical and societal issues posed by the biggest online platforms.
Press contact
- Johannes BAHRKE, phone: +32 2 295 86 15
- Charles MANOURY, phone: +32 2 291 33 91