01 February 2023
roadmap

A hitchhiker's guide through platform liability

Platform providers are digital service providers that act as information intermediaries between their users. They store the information provided by users (making them host providers) and disseminate it to the public at the users' request. Platform services like Facebook, YouTube or TikTok thus facilitate interaction between their users.

When it comes to the liability of platform providers, a distinction has to be made between liability in connection with activities carried out or content uploaded by users, and liability from sources directly attributable to the provider, such as contractual liability towards users or liability for non-compliance with obligations under the new Digital Services Act ("DSA").

Host provider liability exemption

It is often difficult for individuals whose rights are infringed by content on a platform to sue the primary infringer, i.e. the uploader of the infringing content, since they are usually unaware of the infringer's identity. Consequently, infringed parties try to address their claims to the platform provider instead. To protect providers from excessive liability, the EU legislator established a liability exemption for host providers, which include platform providers, in the e-Commerce Directive ("ECD"). Under it, host providers are not liable for the information stored at the request of a user on condition that the provider:

  1. does not have actual knowledge of illegal activity or information and, regarding claims for damages, is not aware of facts or circumstances from which the illegal activity or information is apparent; or
  2. upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the content.

While parts of the ECD will soon be replaced by the DSA, the wording of the liability exemption remains largely untouched. However, the term "illegal information" will be replaced with "illegal content", which is defined as any information that, in itself or in relation to an activity, is not in compliance with EU law or the law of any Member State. The range of content covered by this definition is very broad and includes hate speech, violations of personality rights and copyright infringements.

If the provider becomes aware of illegal content on its platform, it may only rely on the liability exemption if it removes or disables access to that content without undue delay upon obtaining such knowledge or awareness. The CJEU has further clarified that depriving the provider of the liability privilege requires knowledge of a specific unlawful act; knowledge "in a general sense" alone is not sufficient. And while no general monitoring obligation may be imposed on these providers, this does not preclude orders by national authorities or courts to search for content that was previously declared unlawful and to block access to identical and equivalent content, even worldwide.

To encourage providers' own initiatives against illegal content, the so-called Good Samaritan clause in the DSA stipulates that they should not be denied liability privileges merely because they take voluntary measures aimed at detecting, identifying and removing illegal content.

Copyright infringements: a special case

When it comes to copyright infringements, the key question is whether there has been a communication to the public only by the user (who uploads the content and decides whether to make it publicly available) or also by the platform provider itself. After all, if the provider's activity constitutes a communication to the public, it cannot enjoy the liability exemption, since it then plays an active role that gives it knowledge of, or control over, that content.

This assessment depends on whether the old regime, laid out in particular in the CJEU's YouTube/Cyando decision, or the new regime under Art 17 of the Digital Single Market Directive ("DSM Directive") applies.

The new regime: Art 17 DSM Directive

Art 17 of the DSM Directive applies to online content-sharing service providers: providers of an information society service whose main purpose (or one of the main purposes) is to store and give the public access to a large amount of copyright-protected works or other protected subject matter uploaded by their users, which they organise and promote for profit-making purposes.

The list of exclusions is not long:

  • not-for-profit online encyclopaedias as well as educational and scientific repositories;
  • open-source software-developing and sharing platforms;
  • providers of electronic communications services (telecommunications networks);
  • online marketplaces;
  • B2B cloud services and B2C cloud services that allow users to upload content for their own use.

If Art 17 applies, providers must obtain authorisation from the rightsholders. If no authorisation is granted, providers can avoid liability only if they can prove that they:

  • made best efforts to obtain an authorisation;
  • made best efforts to ensure the unavailability of specific works and other subject matter for which the rightsholders have provided the relevant and necessary information; and
  • acted expeditiously to disable access to or remove the respective content, and made best efforts to prevent its future upload.

Smaller and newer platforms enjoy certain exclusions from these obligations.

When assessing whether the provider has complied with the above obligations, the following must be taken into account, in line with the principle of proportionality:

  • the type, audience and size of the service and the type of content uploaded by the users; and
  • the availability of suitable and effective means and their cost for the provider.

The old regime: YouTube/Cyando

When Art 17 does not apply, the activity of the platform provider only constitutes a communication to the public if it goes beyond the mere provision of the platform itself. There can thus be a communication to the public where the platform provider plays a central role, i.e. users could not access the content without its intervention, and it intervenes deliberately in the illegal communication.

The platform provider's activity constitutes a communication to the public:

  1. if it has specific knowledge that protected content is available illegally on its platform and refrains from expeditiously deleting it or blocking access to it; or
  2. where that operator, despite having general knowledge that users are making protected content available to the public illegally via its platform:
    • refrains from putting in place the appropriate technological measures;
    • participates in the selection of illegally shared protected content;
    • provides tools on its platform specifically intended for the illegal sharing of such content;
    • knowingly promotes such sharing; or
    • runs the platform with the (principal) purpose of sharing illegal content.

Appropriate measures providers can take to avoid making such a communication to the public include informing users that sharing illegal content is prohibited (e.g. in the terms of use or during the upload process) and introducing tools to combat copyright infringements on their platform in a credible and effective manner.

New obligations and penalties under the DSA

Platform providers will have to comply with three categories of new DSA obligations.

The first category applies to all platform providers, regardless of size. It includes the disclosure of additional information on content moderation policies and the implementation of a notice-and-action mechanism enabling users to submit notices about allegedly illegal content. The provider must process notices in a timely manner or risk losing its liability privileges.

The second group of obligations applies to all providers except micro and small enterprises. It includes publishing an annual transparency report, implementing an internal complaint-handling mechanism, ensuring transparency with regard to advertising and complying with the prohibition of dark patterns.

The third group of obligations only applies to providers with more than 45 million active users per month that have been designated by the Commission as a "very large online platform". In addition to the other measures, these platforms must inter alia identify and mitigate systemic risks and conduct external compliance audits.

It can be difficult for providers to navigate this jungle of new obligations. Consideration should therefore be given to seeking legal advice at an early stage, as breaches of the DSA may result in fines of up to 6% of annual worldwide turnover.

 

authors: Roland Vesenmayer, Valentin Demschik, Daria Rutecka, Tullia Veronesi