The Online Safety Act (OSA) is finally here – businesses now turn to implementation questions
On 19 September, the Online Safety Bill passed its final Parliamentary debate and is now ready to become law via Royal Assent (probably in October). The finalised Bill also confirms that Ofcom will be the UK’s online safety regulator.
It has been an immensely long journey that can be traced back some six years: the Internet Safety Strategy in 2017, the Online Harms White Paper in 2019, the draft Online Safety Bill in 2021, and the Bill's introduction to Parliament in 2022.
At A&O, we’ve blogged numerous updates about online safety over the last few years:
- 2021: UK Government publishes details of digital regulation to combat Online Harms
- 2022: The UK Online Safety Bill – new measures to protect digital users – but questions remain
- 2022: Online Safety Bill: powering-up Ofcom’s investigation and enforcement powers
These blogs have summarised the key features of the Bill, so we will not repeat its full structure and requirements here. It is, however, worth noting the Government's key expectations of companies under the legislation, recognising that the extent of the obligations differs between ‘Category 1’ and ‘Category 2’ services, and will be most extensive where services are likely to be accessed by children. The Government's press release following completion of the Parliamentary process highlighted the following requirements for online platforms. They must:
- remove illegal content quickly or, for certain types of “priority” illegal content, prevent it appearing in the first place. This monitoring obligation is an example of divergence from the existing intermediary regime in place under the E-commerce Regulations. It also differs from the EU approach under the Digital Services Act;
- enforce the promises they make to users when they sign up, through terms and conditions;
- offer users the option to filter out harmful content, such as bullying, that they do not want to see online;
- prevent children from accessing harmful and age-inappropriate content;
- enforce age limits and age-checking measures;
- ensure the risks and dangers posed to children on the largest platforms are more transparent, including by publishing risk assessments; and
- provide parents and children with clear and accessible ways to report problems online when they do arise.
Ofcom will also have various investigative and corrective powers, including the ability to fine companies up to £18 million or 10% of global turnover (whichever is higher).
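The "whichever is higher" mechanic means the cap is turnover-based for large platforms and a fixed £18m floor for smaller ones. A minimal sketch of that calculation (the function name is ours, and we assume "global turnover" means annual qualifying worldwide revenue in GBP; this is an illustration, not legal advice):

```python
def max_osa_fine(global_turnover_gbp: float) -> float:
    """Upper bound on an OSA fine as described in this post:
    the greater of GBP 18 million or 10% of global turnover."""
    return max(18_000_000.0, 0.10 * global_turnover_gbp)

# Turnover of GBP 500m: the 10% turnover-based cap applies.
print(max_osa_fine(500_000_000))  # → 50000000.0

# Turnover of GBP 100m: 10% would be GBP 10m, so the fixed
# GBP 18m floor applies instead.
print(max_osa_fine(100_000_000))  # → 18000000.0
```

The crossover sits at £180m turnover; below that, the fixed £18m cap is the binding figure.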
What changed during the passage of the Bill?
A large number of amendments were made to the Bill during its passage – some by the Government as its policy evolved in response to new evidence and public concern, others by MPs and Peers. We cannot detail them all here, but here are our top five takeaways on what changed:
1) Focus placed on illegal content and protecting children – requirements for legal-but-harmful content removed. The OSA will now require Category 1 companies to offer empowerment tools for adults, so that users can control what content they see online. It will also require them to remove or restrict access to legal-but-harmful content only where this is consistent with their terms and conditions. There are also new requirements to reduce the likelihood of users encountering fraudulent advertisements.
2) Potentially wider scope – companies covered and range of risks. The legal test for assessing which services would come under Category 1 moved from a test of size and functionality to a test of size or functionality.
The test for child risk assessments was also broadened to consider “the extent to which the design of the service, in particular its functionalities, affects the level of risk of harm that might be suffered by children, identifying and assessing those functionalities that present higher levels of risk”. This means that design-related risks to children will need to be considered separately from, and in addition to, harms arising from disseminating or encountering harmful content.
3) Further powers for Ofcom to tackle harms caused by messages sent via end-to-end encryption. Controversially, the OSA gives Ofcom the power to require service providers to deploy accredited technology to address terrorism and Child Sexual Exploitation and Abuse (CSEA) content, and/or to develop and source technology to tackle CSEA content. This could include measures directed at end-to-end encryption. In September 2023, the Government stated that service providers will not be required to scan encrypted messages until it is “technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content”, although this commitment is not in the OSA itself.
4) Stronger sanctions for senior managers. An offence was added that would hold senior managers liable for a platform’s failure to comply with a CSEA requirement imposed by a confirmation decision from Ofcom. In the most serious and exceptional cases custodial sentences could be imposed.
5) New measures to prevent children accessing pornographic content. This will require use of age estimation or age verification.
This research paper by the House of Commons Library from August 2023 provides a more detailed overview of how the Bill evolved.
New governance for online safety and leveraging existing governance from other compliance regimes
As companies now step up their plans for compliance, questions will arise about how an online safety governance programme should be developed and implemented – there are many novel issues in the OSA that will require new skills and competencies. Companies will also be able to draw on compliance programmes for the Digital Services Act in the EU, which is already in force and moving through its different phases of implementation.
In a June 2023 podcast, colleagues at A&O (Maeve Hanna, Benjamin Scrace and Steve Wood) explored how organisations can look to familiar regimes, such as data protection or financial services regulation, to leverage previous experience and gain insights to support compliance with their new online safety obligations.
Companies will also need to consider how OSA compliance requirements interact with other areas of regulation, such as the GDPR, given the privacy risks posed by OSA requirements such as age assurance and use of proactive technologies to address illegal content.
This is particularly so given the OSA's broad scope: organisations based outside the UK, already bound by a range of extra-territorial obligations under other regimes, may well be caught by the reach of the OSA too.
The roadmap for implementation
Ofcom has set out its key milestones for implementation:
• Phase one: illegal harms duties
Ofcom expects to publish the relevant codes shortly after commencement. These will set out measures that regulated services can take to mitigate the risk of illegal harms.
Ofcom will also publish:
- a register of risks relating to illegal content, and risk profiles of service characteristics that its assessment has found to be associated with a heightened risk of harm;
- draft guidance to services on how to conduct their own risk assessments and on how services can identify illegal content;
- draft guidance on record keeping; and
- draft enforcement guidelines.
• Phase two: child safety duties and pornography
Ofcom plans to publish a consultation on draft guidance on age assurance from Autumn 2023.
Ofcom will draft codes of practice relating to the protection of children around six months after its powers commence.
It also expects to consult on a register of risks and risk profiles relating to harms to children, and on draft risk assessment guidance focusing on children's harms.
• Phase three: transparency, user empowerment, and other duties on categorised platforms
Ofcom’s final stage of implementation focuses on additional duties that fall only on designated Category 1, 2A or 2B services (if they meet certain thresholds set out in secondary legislation to be made by Government). Ofcom will send advice to Government about categorisation around six months after Royal Assent and will publish the register of services once the secondary legislation has passed.
This phase will also include calls for evidence and consultation on the remaining codes and guidance, including on transparency requirements.
The codes of practice Ofcom is required to develop under the OSA will also be subject to Parliamentary approval.
Action for businesses
Much of the detail, the thresholds and specific obligations will be set out in secondary legislation and codes still to come. Nonetheless, we have been working with companies to help them understand if they are likely to be caught by the new rules and, if so, to advise and support them on the road to implementation. Companies in scope can also consider responding to the upcoming consultations to ensure their views are taken into account.