The UK Online Safety Act 2023 - What do you need to know?

    The Ashurst Emerging Tech Series is a collection of briefings compiled by our UK and European Digital Economy Transactions team, designed to help businesses understand and prepare for the impacts of new and incoming digital services and emerging tech legislation.

    In this first briefing, we consider the UK Online Safety Act 2023 (OSA).

    Introduction

    The OSA is the UK's framework for regulating certain internet services and digital communications in order to make them safer.

    Under the OSA, Ofcom has been granted powers as the online safety regulator. The OSA will come into effect through a phased process, prioritised according to the issues Ofcom deems to be causing the most harm online. In the coming months, Ofcom will publish a number of codes of practice and guidance clarifying certain aspects of the OSA's application.

    While the OSA and the EU's Digital Services Act (DSA) have similar aims (and many businesses will fall under both pieces of legislation), they vary significantly in scope, the obligations they each impose and the nature of enforcement.

    Timeline

    Following a raft of delays and amendments since its initial publication, the Online Safety Bill received Royal Assent on 26 October 2023 and is now in force as the OSA.

    We have set out details of Ofcom's roadmap below.

    What services are regulated by the Online Safety Act?

    The OSA covers a broad range of services which fall into the following categories:

    • user-to-user services – i.e. an internet service where content is generated directly on, uploaded to or shared on the service by a user, and may be encountered by another user (e.g. social media platforms); or
    • search services – i.e. an internet service that allows users to search more than one website or database (e.g. a search engine or any other website that includes the functionality of a search engine).

    The definition of user-to-user service in the OSA is wide, extending beyond social media platforms and covering any service that enables one user to interact with content from another user, even if such interaction is not a material function or aim of that service. However, notable exemptions apply, including for email, SMS/MMS and internal business communications.

    Depending on the type of service, or indeed the reach that such service has (see service categories section below), service "providers" are subject to a range of obligations under the OSA. Providers of a user-to-user service that includes a public search engine (a "combined service" under the OSA) will be required to comply with the obligations applying to both types of service.

    Currently, most user-to-user and search services in the UK are not subject to any safety regulations, apart from a limited number of user-to-user services that are subject to the video-sharing platform regime under the Communications Act 2003 – a regime that will be repealed and replaced by the OSA.

    The OSA has a wide extra-territorial scope and applies to providers:

    1. that provide services to a significant number of users in the UK;
    2. that target the UK market; or
    3. that offer services that can be used in the UK by individuals, where there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the UK presented by user-generated content or search content on that service.

    The OSA also regulates providers of pornographic content, which we have not considered further in this briefing.

    Service categories with enhanced obligations

    Similar to the DSA, the OSA takes a risk-based approach to classifying organisations, making those that meet certain thresholds subject to additional duties (depending on their size, functionality and other defined factors). Secondary legislation will set out the thresholds for determining whether a service falls within any of the special categories (Category 1, 2A or 2B).

    While Ofcom anticipates that 100,000 services in the UK alone are likely to be subject to the general scope of the OSA, most of these services will not fall into any of the three special categories – and so most providers will not be subject to any enhanced obligations. Ofcom will prepare a list of emerging Category 1 services as soon as possible after the secondary legislation defining the Category 1 conditions is passed.

    In principle, categorisation as a Category 1 or Category 2A service is similar to the European Commission's designation of very large online platforms (VLOPs) and very large online search engines (VLOSEs) under the DSA.

    Please see the "Ofcom's roadmap" section below for further details on the expected timing of the categorisation of services.

    What does the Online Safety Act require businesses to do?

    The OSA imposes a large number of wide-ranging obligations on providers. These include duty of care obligations and obligations to prevent fraudulent advertising, remove illegal and terrorist content and protect content of "democratic importance".

    The OSA is expected to have the biggest impact on social media, digital messaging, search and online advertising services.

    1. Duty of care requirements

    Duty of care obligations include:

    • assessing user bases and the risks of harm to users of the service;
    • implementing "suitable and sufficient" risk assessments of the potential level, nature and severity of harm to individuals caused by the service;
    • taking proportionate steps to mitigate and manage harm to users, including to: (a) remove illegal content quickly and prevent it from appearing in the first place; and (b) prevent children from accessing illegal, harmful and/or age-inappropriate content;
    • putting in place systems and processes enabling certain types of content to be reported to the provider;
    • establishing complaints procedures for users that are transparent and easy to use, and which enable appropriate actions to be taken in response;
    • considering users' rights to freedom of expression and privacy when implementing safety measures and policies; and
    • keeping records and reviewing compliance with the OSA.

    2. Making users aware of their contractual rights

    As noted below, providers have obligations to identify and prevent certain categories of content on their service.

    Notwithstanding these obligations, the OSA requires providers to ensure that their terms of service include details of a user's right to bring a claim for breach of contract where content is removed, or where a user is banned or suspended for reasons related to content, in each case in breach of the provider's terms of service.

    3. Terrorism content

    Ofcom may require providers to identify and prevent terrorism content through the use of "accredited technology". This means Ofcom will have the power to impose the use of scanning technology without having to obtain a court order or judicial commissioner authorisation, effectively bypassing the statutory safeguards that apply to existing surveillance powers under the Investigatory Powers Act 2016.

    This provision (and the equivalent provision for child sexual exploitation and abuse content – see below) has attracted a significant amount of media attention due to concerns that it could require providers to spy on individuals by requiring them to break end-to-end encryption of private messages.

    While the issue was ultimately kicked down the road, with the Government advising that it is not currently "technically feasible" to scan encrypted private messages, we could well see it rearing its head again as technology develops.

    4. Child sexual exploitation and abuse (CSEA) content

    Providers must use accredited technology to identify and prevent CSEA content and must also put in place systems and processes to ensure that (as far as possible) detected but unreported CSEA content is reported to the National Crime Agency.

    5. Fraudulent advertising

    Category 1 and Category 2A service providers must operate their services using proportionate systems and processes designed to prevent and swiftly remove fraudulent advertising. The Government hopes that this change will increase user confidence in transacting over the internet.

    This provision also extends to influencers failing to declare payment for promoting products – meaning that these individuals could face higher penalties for breaching requirements.

    6. User empowerment duties

    To the extent proportionate, Category 1 service providers are required to give users greater control over certain types of content to which they might be exposed on the service. This includes content which incites hatred against people of a particular race, religion or sexual orientation.

    7. Consideration of important categories of content

    Category 1 service providers have a duty to provide their services in a way that ensures that content moderation decisions take into account content deemed to be important under the OSA – with content of "democratic importance" (considering a wide diversity of political opinions), news publisher content and journalistic content given such status.

    8. Requirement to name a Senior Manager

    Ofcom may require a provider to designate a Senior Manager, who plays a significant role in the decision making and management of the provider. A Senior Manager can be tasked with responding to information requests from Ofcom and with ensuring compliance with the OSA more generally (see penalties section below for details of Senior Manager personal liability).

    9. Communication offences

    The OSA contains a number of new communications offences that apply to individuals. These include:

    • false communications - sending a communication known to be false, intended to cause non-trivial psychological or physical harm to a likely audience and without reasonable excuse;
    • sending threatening communications - sending a threatening communication while intending (or being reckless as to whether) the recipient would fear that the threat would be carried out;
    • a new "epilepsy trolling" offence - relating to sending or showing flashing images electronically;
    • encouraging / assisting with self-harm - sending communications that encourage or assist a person in serious self-harm;
    • sending unsolicited sexual images - amending the Sexual Offences Act 2003; and
    • sharing or threatening to share intimate images without consent - amending the Sexual Offences Act 2003.

    Non-compliance

    Ofcom has various mechanisms available to it to enforce compliance with the OSA.

    1. Rights to gather information on compliance

    Ofcom has a number of information gathering rights. These include rights to request information and to investigate, inspect or audit a provider. Providers or Senior Managers can be fined or face other enforcement action if they fail to comply with any such requests. 

    2. Fines

    Ofcom may issue fines against providers of up to £18 million or, if higher, 10% of global annual turnover for breaches of the OSA (although fines must be appropriate and proportionate to the provider's non-compliance). By way of illustration, a provider with a global annual turnover of £500 million could face a fine of up to £50 million.

    Fines can also be levied directly against Senior Managers where they commit an "information offence", such as failing to respond to an Ofcom information request or providing false or encrypted responses.

    3. Business disruption measures

    Following an application to the court, Ofcom can require a third party who provides an ancillary service (i.e. a service that facilitates provision of a service regulated by the OSA) to withdraw or restrict its ancillary service where a provider has failed to comply with an Ofcom order or pay a penalty under the OSA.

    The aim of these provisions is to disrupt the provider's business operations in light of its continued breach of the OSA.

    Super-complaints

    Providers may also be the subject of "super-complaints". These are complaints to Ofcom by "eligible entities" that a provider's service and/or conduct presents a material risk of causing significant:

    • harm or adverse impact to users or the public (or a specific group within either); or
    • adverse effect on the right to lawful freedom of expression.

    Complaints relating to a single service or individual provider will only be admissible if the complaint is considered by Ofcom to be of particular importance or relates to a particularly large number of users or members of the public.

    "Eligible entities" are to be defined in the Secretary of State's secondary legislation. They will include bodies representing the interests of users of regulated services, members of the public or a specific group within either category. Secondary legislation and Ofcom guidance will also clarify the impact of providers receiving super-complaints.

    Appeals

    Providers may appeal Ofcom's:

    • decisions in relation to their special categorisation on the register for Category 1, 2A and/or 2B services;
    • penalty notices;
    • notices related to terrorism content or CSEA content; and
    • decisions under any other notices issued in relation to compliance with an "enforceable requirement" – with enforceable requirements covering the majority of the obligations under the OSA.

    Ofcom's roadmap for implementation of the Online Safety Act

    Ofcom published its approach to implementing the OSA on the same day as the OSA received Royal Assent. This roadmap (which updates Ofcom's July 2022 roadmap, itself updated in June 2023) sets out a phased approach to consultations around Ofcom's codes of practice and guidance, in the following order:

    • Phase 1: Protecting people from illegal content;
    • Phase 2: Protecting children; and
    • Phase 3: Transparency, user empowerment and other duties on categorised platforms.

    Ofcom's work on Phase 1 has now commenced and its consultation process will launch on 9 November 2023.

    The Government has stated that the majority of the OSA's provisions will commence from the end of December 2023, but further clarification of expected timelines is needed to enable regulated organisations to input into the consultations and plan their compliance with the OSA.

    Considerations and consequences

    Given the broad scope of the OSA, businesses should carefully review all of their online content and services.

    The Government has separately published a "Safer platform checklist" to assist online platforms with taking steps to prepare for the OSA and to help to keep users safe.

    Additional steps businesses can take include:

    • following publication of relevant secondary legislation, determining whether they may be classified as one of the special categories (Category 1, 2A or 2B);
    • carrying out risk assessments of their platforms and operations;
    • reviewing their complaints procedures and terms of service for compliance;
    • considering measures for monitoring online content;
    • considering how they will establish processes/systems to identify and report potential harm to children on their platforms;
    • monitoring Ofcom's future statements, codes of practice and guidance on the OSA for developments; and
    • considering whether they will also be regulated by the DSA and, if so, assessing any intersection between their obligations under the DSA and the OSA.

    We will be publishing a briefing on the DSA in the coming weeks, as part of our Emerging Tech Series.

    Authors: David Futter, Partner; Aimi Gold, Senior Associate; Sian Deighan, Associate

    The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
    Readers should take legal advice before applying it to specific issues or transactions.