As part of the European Commission's digital strategy3 for the EU, the DSA aims to transform the internet into a safer space for users in Europe. Its focus is on protecting the fundamental rights of users and tackling illegal content and misinformation in the context of digital services. By creating extensive obligations for providers of digital services regarding, e.g., targeted advertising, content moderation, transparency of terms and conditions, internal "complaint-handling systems" and the like, websites and online platforms are to become more transparent.
Who is affected?
The DSA targets a wide range of online services, from websites to internet infrastructure services and online platforms.
Pursuant to Art 2, the DSA applies to all information society services (or "intermediary services") offered to recipients that are established or located in the EU, irrespective of where the service provider is established. Intermediary service means (i) a mere conduit service, providing access to or transmitting information in a communication network, (ii) a caching service, transmitting information in a communication network also involving the temporary storage of that information, and (iii) a hosting service, consisting of the storage of information provided by a user (or "recipient") of the service.
This means that the DSA applies not only to internet access providers, but also to cloud and web hosting services, search engines, and online platforms such as social media platforms, online marketplaces, app stores and collaborative economy platforms, among others. All these services, whether established in the EU or not, will have to comply with the DSA's rules.
SMEs will benefit from numerous provisions adapted to their respective size and capabilities, without reducing their accountability under the DSA.4 Likewise, specific provisions impose additional obligations on very large online platforms ("VLOPs") and very large online search engines ("VLOSEs"), i.e. platforms or search engines with an average of at least 45 million active users per month in the EU.5
What are the key points?
The DSA explicitly states that intermediary services have no obligation to monitor all the information they transmit or store "to seek facts or circumstances indicating illegal activity".6 This means that the widely discussed obligation to implement upload filters did not make it into the final text of the DSA.
Other things that affect all intermediaries include:
- Intermediaries must follow standardised rules upon receipt of an order from the relevant national court or authority to act against illegal content and must inform the court or authority without undue delay about the order's fulfilment.
- Intermediaries must designate a single contact point to enable direct communication with authorities, the Commission and other relevant bodies. Also, a single point of contact for users of the service to communicate with the intermediary must be designated.
- Terms and conditions: Information must be provided in the terms and conditions about internal policies, procedures, measures and tools used for content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of the internal complaint handling system. This information must be in clear, plain, intelligible, user-friendly and unambiguous language, and must be publicly available in an easily accessible and machine-readable format.
- Reporting obligation: Intermediaries must publish reports on content moderation at least once a year containing information on the type of illegal content, the time needed to delete the content, content moderation undertaken on the provider's own initiative, training and assistance measures for persons in charge of content moderation, measures that affect the availability of information provided by users, the number of complaints received through the internal complaint-handling systems, any automated means used for content moderation, etc.7 Additional semi-annual reporting obligations apply to providers of online platforms.
Additional obligations for online platforms
- Notice and action mechanisms must be in place to allow users to notify intermediaries about illegal content. Online platforms must implement an internal complaint-handling system that enables users who have submitted a notification or are affected by a platform's decision (e.g. to remove content or to suspend or terminate the user's account) to lodge complaints against that decision electronically and free of charge. Platforms must suspend users who frequently provide manifestly illegal content. Users must also have the option to select a certified out-of-court dispute settlement body.
- Notices of trusted flaggers (who must be competent for identifying illegal content, independent and objective8) must be prioritised and processed without undue delay.
- Prohibition to use "dark patterns": The interface of an online platform must not be designed, operated or organised in a way that deceives or manipulates the users of the service or impairs their ability to make a free and informed choice.
- Information on advertisements: Users must be able to identify clearly, concisely and in real time that the information at hand is an advertisement, the identity of the advertiser who paid for it, and the parameters used to determine the recipient of the advertisement and how to change them. Presenting advertisements based on profiling using special categories of personal data is prohibited.9
- Obligation to set out in plain and intelligible language in the terms and conditions the main parameters used to determine the order of information presented to the user. The user must be provided with a function to modify the parameters.
For online platforms allowing consumers to conclude distance contracts with traders, the following additional obligations apply:
- The trader must be identified by the intermediary by means of an identification document or other electronic identification.
Additional obligations for VLOPs and VLOSEs
- Reporting obligations on the number of monthly active users (starting from 17 February 2023 and at least every six months thereafter).
- Risk assessment for risks stemming from the design or functioning of the service and its related (algorithmic) systems and, based on its outcome, the implementation of reasonable, proportionate and effective mitigation measures.
- The obligation to assess compliance with the DSA at least once a year through an independent audit at its own expense.
- Additional advertising transparency obligations.
- Access to data necessary to monitor and assess compliance with the DSA for the Digital Services Coordinator and the Commission and the obligation to explain the design, logic, functioning and testing of the algorithmic systems.
- The obligation to implement an independent compliance function with sufficient authority and resources to ensure compliance with the DSA.
- Additional reporting obligations.
Pursuant to Art 49 DSA, Member States must designate the authorities responsible for supervising the intermediaries and enforcing the DSA, as well as a Digital Services Coordinator responsible for coordination at a national level. Notably, the Commission itself will be the administrative body responsible for supervising and enforcing the DSA vis-à-vis VLOPs and VLOSEs to prevent a "bottleneck" effect in DSA enforcement due to inadequately staffed or funded national authorities.10
The penalties for an infringement of the DSA are quite harsh and can reach up to 6 % of the annual worldwide turnover achieved by the intermediary in the preceding financial year. This means that fines can be even higher than under the GDPR (which caps at 4 % of worldwide turnover).
Timeline and to-dos
After publication in the Official Journal of the European Union, the DSA will enter into force on 16 November 2022 and will apply in full only 15 months later, from 17 February 2024.
However, certain obligations for VLOPs and VLOSEs will apply as early as 16 November 2022. These include, among others, the reporting obligation on the number of monthly active users and the Commission's competence to adopt delegated acts to designate online platforms or search engines as VLOPs or VLOSEs, to lay down rules for the performance of audits and to establish the technical conditions for data access.
Considering the extensive internal changes (organisational, technical and procedural) required for intermediaries to comply with the DSA, and the severe fines, intermediaries should waste no time in taking the steps necessary for DSA compliance.
Steps to follow
- Analyse and classify: What type of intermediary is my organisation/company? Which provisions are applicable?
- Determine the status quo: Where is the organisation/company already compliant with the DSA and where is it not? Identify deviations from the DSA's requirements.
- Lay the foundations: Start implementing the DSA by creating appropriate technical, personnel, structural and organisational measures.
- Monitor: The DSA provides for various possibilities for the Member States to adopt national acts and for the Commission to adopt delegated acts. Monitoring the legislation of the EU and the relevant Member States will therefore be essential.
- Check: Conduct an internal or external audit for compliance before the applicable cut-off date.
1 Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act).
2 https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=OJ:L:2022:277:TOC (accessed on 27 October 2022).
3 For more information on the digital strategy see https://www.schoenherr.eu/content/the-european-data-act/; https://digital-strategy.ec.europa.eu/en.
4 See Art 29 DSA.
5 See Art 33 DSA et seq.
6 See Art 8 DSA.
7 See Art 11 – 16 DSA.
8 See Art 22: The status "trusted flagger" may be awarded upon application by the Digital Services Coordinator of the applicant's Member State.
9 See Art 9 (1) GDPR: This means data on racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data, health data or data concerning a natural person's sex life or sexual orientation.
10 See Art 56 DSA.