
Online content and take-down requirements: what are the boundaries?

Emma Keeling - Senior PSL

14 November 2019

De-referencing in the EU…

Last month we reported on the Google/CNIL judgment, where the Court of Justice of the European Union (the CJEU) ruled that Google was not necessarily required to de-reference personal data on all its search engine domains in order to comply with the right to be forgotten under the General Data Protection Regulation (the GDPR).

A controller can comply with its obligations under the GDPR by: i) de-referencing personal data only from domain names having extensions associated with Member States; and ii) taking sufficient measures to prevent EU-based data subjects from having access to such de-referenced personal data.

The CJEU emphasised that EU law neither requires global de-referencing nor prohibits it, so it is within supervisory authorities’ competence to: (i) balance a data subject’s right to privacy and to the protection of personal data against the right to freedom of information, in the context of national standards of protection of fundamental rights; and (ii) order global de-referencing if appropriate.

October saw another CJEU ruling which suggests that a global response may be required by digital companies in some cases.

This ruling concerns the E-Commerce Directive (2000/31/EC) rather than the GDPR.

What constitutes adequate “take down” under the E-Commerce Directive?

In 2016 Eva Glawischnig-Piesczek (a member of the Austrian National Council, the Nationalrat) requested that Facebook delete defamatory posts made about her by a Facebook user. The Austrian court required Facebook to follow the request and the company disabled the content in Austria. Further appeals concerned the scope of the takedown - whether it should apply globally and whether it should apply to materials involving similar content to that of the defamatory text (rather than identical phrases). In order to clarify the position, the Austrian Supreme Court, the Oberster Gerichtshof, made a reference to the CJEU.

It is well understood that, under the E-Commerce Directive (specifically Article 14), an information society service provider, such as Facebook, is not liable for stored information if it has no knowledge of its illegal nature, or if it acts expeditiously to remove or disable access to that information as soon as it becomes aware of it.

In addition, the E-Commerce Directive (specifically Article 15) prohibits Member States from imposing on providers a general obligation to monitor the information which they transmit or store, or a general obligation to actively seek facts or circumstances indicating illegal activity.

It is also clear that a provider can be ordered by a court to terminate or prevent an infringement, including by removing the illegal information or by disabling access to it.

However, Ms Glawischnig-Piesczek considered that the scope of Facebook’s obligations in this case should extend beyond removal of the specific content from Austrian sites and should also capture equivalent material that was accessible worldwide.

The CJEU ruled that the E-Commerce Directive does not preclude a court of a Member State from ordering a host provider:

  • “to remove information which it stores, the content of which is identical to the content of information which was previously declared to be unlawful, or to block access to that information, irrespective of who requested the storage of that information”;
  • “to remove information which it stores, the content of which is equivalent to the content of information which was previously declared to be unlawful, or to block access to that information, provided that the monitoring of and search for the information concerned by such an injunction are limited to information conveying a message the content of which remains essentially unchanged compared with the content which gave rise to the finding of illegality and containing the elements specified in the injunction, and provided that the differences in the wording of that equivalent content, compared with the wording characterising the information which was previously declared to be illegal, are not such as to require the host provider to carry out an independent assessment of that content”;
  • “to remove information covered by the injunction or to block access to that information worldwide within the framework of the relevant international law”.

[emphasis added]

A balancing act

The CJEU suggests that the judgment is not intended to require companies to actively monitor all material posted on their platforms, as the current provisions of the E-Commerce Directive specifically prohibit such a general (as opposed to “specific”) monitoring obligation. Rather, monitoring of potentially harmful material should be linked to existing court rulings and restricted to the specific instances of defamatory material those rulings identify. The CJEU envisages that the host provider may have recourse to automated search tools and technologies for these purposes, such that no independent assessment of the content is required. Worldwide removal of access to content assumes that the relevant Member State, in imposing the requirement, acts within the framework of international law. The ruling does not extend to notices from users merely alleging that content is illegal.

However, the judgment does not address the practicalities of application. Facebook has expressed concern regarding critical questions of freedom of expression and the role that internet companies should play in monitoring, interpreting and removing speech, and has expressed the hope that national courts will weigh the effects of their injunctions on free expression rights and set clear definitions of “identical” and “equivalent” speech.

What next for online platforms?

The new European Commission, led by Ursula von der Leyen and due to take over before the end of 2019, has already flagged plans for a Digital Services Act in 2020. This proposed legislation is expected to address issues such as responsibilities for the types of content posted online and impose a duty of care on online platforms. It is anticipated that provisions regarding disinformation, online advertising and data access could also be covered by the legislation and those in the industry will be watching closely to understand if and how the liability regime for online content will change.