European Parliament adopts its position on the proposed Digital Services Act
The DSA aims to modernise and establish an EU-wide uniform framework on the handling of illegal or potentially harmful content online, the liability of online intermediaries for third party content, the protection of users' fundamental rights online and bridging the information asymmetries between online intermediaries and their users. You can read about the key aspects of the European Commission’s proposal for the DSA, issued on 15 December 2020, in our blogs here and here.
Compared to the European Commission’s proposal, the key amendments introduced by the European Parliament include the following:
- targeting or amplification techniques that process, reveal or infer personal data of minors and vulnerable groups, as well as special categories of data as defined by Article 9 GDPR (e.g. personal data related to political beliefs, religion or sexual orientation), are prohibited;
- expanding the provisions that impose additional information and transparency requirements for targeted advertisements, including in relation to informing users of digital services how their data will be monetised. The amendments will require online platforms to ensure that recipients of their services can refuse or withdraw consent for targeted ads in a way that is no more difficult or time-consuming than giving consent. Refusing consent to the processing of personal data for advertising purposes should not result in disabling access to the platform's functionalities, and alternative access options (including tracking-free options) should be fair and reasonable;
- users of digital services and organisations representing them must be able to seek compensation for any direct damage or loss resulting from platforms not respecting their due diligence obligations;
- providers of intermediary services are not permitted to use deceptive or nudging techniques to influence users’ behaviour through “dark patterns”. New provisions in the DSA prohibit providers from giving visual prominence to any one of the consent options and from repeatedly requesting consent to data processing where consent was previously refused (for example, through the use of pop-ups);
- expanding the additional obligations on very large online platforms. For example, when conducting mandatory risk assessments and implementing risk mitigation measures, these providers will need to take into account fundamental rights. There are strict requirements for independent audits (e.g. to prevent partiality or dependency of auditors), ensuring the transparency of “recommender systems” (the algorithms that determine what users see) and providing users with at least one option which is not based on profiling; and
- the terms and conditions of the intermediary service providers must be fair, non-discriminatory and transparent, respect fundamental human rights and freedoms, and be made available in the language of the Member State towards which the service is directed. These terms and conditions should be formulated in clear and unambiguous terms and include information on any policies, procedures, measures and tools used for content moderation, including algorithmic decision-making, human review and the right to terminate the service. These terms and conditions should be supplemented by a concise and easily readable summary of the key elements. Terms that do not comply with these requirements are not binding on service recipients.
Negotiations between the European Parliament and the Council of the EU to agree the text of the DSA are expected to follow. The Council of the EU agreed its position on the DSA on 25 November 2021. You can read a summary of the Council’s position, prepared by Allen & Overy.