
UK government publishes details of digital regulation to combat Online Harms


On 15 December 2020, the UK government published its long-awaited proposals "to make the UK a safer place to be online" in its final response to the Online Harms White Paper. The response proposes a new regulatory framework for online companies, targeting a wide range of illegal or harmful content affecting individual users. The government has committed to introducing the Online Safety Bill in 2021, and the proposals would mark the end of the "era of self-regulation", instead placing significant legal and practical responsibility on online companies.

The government launched the Online Harms White Paper in April 2019, and calls for details of its final proposals intensified over 2020 as more and more of everyday life moved online during the Covid-19 pandemic. However, this is just one aspect of the current wave of international digital regulation: the UK Digital Markets Taskforce recently published its recommendations for a new UK regime governing the behaviour of digital firms with market power, and the European Commission released its Digital Services Act package, also on 15 December 2020.

The regime will apply to companies anywhere in the world whose services host user-generated content and/or facilitate public or private online interaction between users, where those services can be accessed by users in the UK, as well as to search engines. Expectations on companies will be differentiated according to their size and the scope of their activities.

At the heart of the proposals is a new duty of care on companies to improve the safety of their users online. However, the proposed "proportionate and risk-based" regulation is essentially one of systems and controls: the government's intention is that online platforms should have appropriate systems and processes in place to protect users, and that action should be taken against them if they fall short.

Ofcom has been confirmed as the regulator for online harms in the UK, building on its experience regulating TV and radio programmes and UK-established video-sharing platforms. Its running costs will be paid by certain in-scope companies.

Ofcom will have a broad range of powers to enforce compliance with the duty of care, including issuing fines of up to 10% of a company’s annual global turnover or £18 million, whichever is higher, and imposing business disruption measures.

A draft Online Safety Bill is expected during 2021, so it could still be some time before legislation is in force once the draft bill has passed through Parliament. In the meantime, companies within scope should consider how the new rules would apply to them, including compliance with the voluntary interim codes of practice published by the government on tackling online terrorist content and child sexual exploitation and abuse content and activity.

