Online Safety Bill receives Royal Assent

November 09, 2023

The Online Safety Bill received Royal Assent on 26 October 2023, becoming the Online Safety Act 2023 (OSA). This landmark legislation has been the topic of much discussion, both in Parliament and in the technology sector, since the first policy papers and consultations on online safety were published in October 2017.


Overview

The OSA establishes a new regulatory regime aimed at addressing illegal and harmful content online. It imposes duties on companies that operate a wide range of popular online services, with a view to keeping people, especially children, safe online. It requires companies to do this by assessing and managing safety risks arising from content and conduct on their sites and apps. Services in scope of the new rules include user-to-user services, such as social media platforms, photo- and video-sharing services, chat and instant messaging platforms, and online and mobile gaming, as well as search services and pornography sites.

The UK Government declared that the new law would take a “zero-tolerance approach to protecting children from online harm, while empowering adults with more choices over what they see online”. The Office of Communications, commonly known as Ofcom, is appointed as the online safety regulator, and the OSA provides it with additional enforcement powers aimed at ensuring that technology firms take steps to protect their users from illegal content. The “largest and riskiest” service providers, such as social media platforms, will be designated “Category 1” services and must also protect their users from other harms, such as bullying, self-harm and pornographic content.


Risk assessment

The OSA sets out specific duties to carry out “suitable and sufficient risk assessments” in relation to illegal content and content that is harmful to children, to “take appropriate steps” to keep those risk assessments up to date, and to carry out a further risk assessment before making any significant change to the design or operation of a service. It also sets out duties to “use or take appropriate measures” and to use “proportionate systems and processes designed to” prevent the relevant harms from occurring, including duties relating to content of democratic importance, preventing individuals from encountering fraudulent advertising, and protecting freedom of expression and journalistic content. The Act further imposes duties in relation to complaints handling, record-keeping and “transparency reporting”.


Enforcement

The OSA provides Ofcom with a range of enforcement powers. Depending on the provision, non-compliance may lead to service or access restriction orders, or penalties including fines of up to 10% of qualifying worldwide annual revenue or £18 million, whichever is greater. 
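
To illustrate how the penalty ceiling operates, here is a minimal sketch in Python; the revenue figure is hypothetical, and the calculation simply takes the greater of the two statutory limits.

```python
def max_osa_fine(qualifying_worldwide_revenue_gbp: float) -> float:
    """Upper bound on an OSA fine: the greater of 10% of qualifying
    worldwide annual revenue or a flat £18 million."""
    return max(0.10 * qualifying_worldwide_revenue_gbp, 18_000_000.0)

# Hypothetical provider with £500m qualifying worldwide annual revenue:
# 10% of revenue (£50m) exceeds the £18m floor, so £50m is the ceiling.
print(f"£{max_osa_fine(500_000_000):,.0f}")  # prints £50,000,000
```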


Use of new “accredited” technologies

One of the more controversial provisions in the OSA is the power it grants Ofcom to require, by notice, a regulated user-to-user service or search service to use “accredited technology” to identify harmful content, such as terrorism content and child sexual exploitation or abuse material. Concerns have been raised that this challenges the end-to-end encryption used by many platforms to prevent access to their users’ personal data. To comply with such a notice, service providers would need to be able to access the data shared on their service and review it for potentially harmful content, which may not currently be possible without breaking that encryption. This provision of the OSA will only apply once a technical solution has been accredited by Ofcom on the basis of its accuracy in detecting the relevant harmful content. Moreover, the OSA allows the use of these powers only where it is considered “necessary and proportionate”.


Wider legislative landscape

The OSA forms part of the existing and emerging laws in the UK and EU requiring companies to conduct risk assessments and exercise due diligence to address a range of potential harms, such as bribery, tax evasion, fraud and human rights impacts. Indeed, the OSA becomes law at the same time as the UK Economic Crime and Corporate Transparency Act 2023, which introduces, amongst other things, a new offence of failing to prevent fraud.

In parallel, the EU has developed its own legislation to prevent online harms, in the form of the EU Digital Services Act (DSA). The DSA is aimed at addressing illegal content and disinformation and at ensuring transparency in advertising, including in relation to e-commerce.

The OSA and the DSA are examples of laws relating to the “downstream” impacts in the value chain, namely those which affect the users of a company’s goods and services. In a similar vein, the risks that may arise from the intended use and foreseeable misuse of AI systems are contemplated in the EU’s proposed Artificial Intelligence Act. The EU’s draft Directive on Corporate Sustainability Due Diligence (CSDDD) proposes to impose due diligence obligations in respect of both “downstream” and “upstream” impacts in a company’s value chain, such as those that affect the human rights of workers, communities or the environment during the production of goods or services.


Extraterritorial application

The OSA is extraterritorial insofar as it applies beyond UK-incorporated companies to those which have “links with” the UK. This may be compared to the UK Bribery Act 2010 and the UK Modern Slavery Act 2015, which both reference companies that “carry on a part of their business” in the UK. Moreover, like the Bribery Act and the Modern Slavery Act, the OSA applies to activities that take place outside the UK. In particular, for the purposes of determining whether content is illegal under the OSA, “no account is to be taken of whether or not anything done in relation to the content takes place” in the UK.


Next steps

Although the OSA is now law, Ofcom is in the process of drafting codes of practice and guidance notes, on which it will consult in due course. As laid out in the Act, these will be rolled out over three phases. Phase one will focus on illegal content. The first consultation on illegal harms – including child sexual abuse material, terrorist content and fraud – will be published on 9 November 2023 and will contain proposals for how service providers can comply with the illegal content safety duties, together with draft codes of practice.

Phase two is directed at child safety, pornography, and protecting women and girls. The first consultation, due in December 2023, will set out draft guidance for services that host pornographic content. Further consultations relating to child safety duties under the Act will follow in Spring 2024. Ofcom will publish draft guidance on protecting women and girls by Spring 2025.

Phase three will concentrate on additional duties for categorised services (services which meet certain criteria related to their number of users or high-risk features). Ofcom’s advice and draft guidance will be published in Spring 2024, with a register of categorised services expected to be published by the end of 2024. Further proposals, including a draft code of practice on fraudulent advertising and transparency notices, will follow in early and mid-2025 respectively, with final codes and guidance due to be published towards the end of that year.