ICO issues guidance on content moderation technologies and processes

The UK Information Commissioner’s Office (ICO) has issued guidance on content moderation technologies and processes for the first time (the Guidance). In its press release of 16 February 2024, the ICO highlighted the growing need for content moderation, driven in part by laws such as the Online Safety Act 2023.

The Guidance defines user-generated content as content that a user either generates directly on the relevant service or uploads or shares on the service, and that in each case can be encountered by another user of the service. Moderation can involve either (i) the analysis of user-generated content to assess whether it meets certain standards; or (ii) any action taken by a service as a result of that analysis. Content moderation therefore encompasses a wide range of activities, from the removal of illegal or harmful content to reducing the visibility of certain content in users’ news feeds.

In order to analyse content, organisations may use both automated systems and human review. Automated systems can assess whether content is illegal or harmful, as well as the likelihood that it breaches the organisation’s service content policies. Organisations may also use human moderators to manually check content that has already been through an automated process. In each case, personal information is likely to be involved, given that the content is either directly about a person or can be connected to other information in a way that makes someone identifiable. The Guidance includes advice on how organisations should carry out content moderation in compliance with data protection law and addresses:

  • risk assessments and DPIAs;
  • the identification of a lawful basis for the processing of personal data;
  • ensuring personal data is processed in a way that is adequate, relevant and limited to what is necessary;
  • processing data in a fair and reasonable manner;
  • transparency requirements;
  • regular reviews of data retention periods;
  • how to approach special category data;
  • security;
  • data subject rights;
  • the sharing of content moderation information, including international transfers; and
  • the roles of parties within the ecosystem (e.g. controllers or processors).

This guidance forms part of the ICO’s ongoing collaboration with Ofcom regarding data protection and online safety.

The Guidance is available here and the press release here.
