Overview

For businesses within the scope of the Online Safety Act (OSA), achieving compliance and swiftly adapting to extensive new guidance is a significant challenge. The contrasting approaches to online safety in Europe, and tensions with the US over content moderation, further complicate the landscape. This briefing explores the OSA’s scope, its phased rollout, key duties for providers, enforcement measures and the broader geopolitical backdrop.

The OSA aims to create a safer online environment by mitigating the risks of illegal content and content which is harmful to children, while balancing freedom of expression concerns. The OSA does not give Ofcom the power to require the removal of specific harmful content. Instead, it imposes duties on services to put in place effective processes and systems to guard against that content.

Scope of the Online Safety Act

Types of services in scope

The OSA does not apply to all online platforms, but small and medium enterprises are not exempt. Ofcom initially estimated that around 100,000 businesses are likely to be in scope. The three types of services within its scope are:

1. User-to-user services (U2U services), which enable users (whether individuals or entities) to generate, upload or share content that can be encountered by other users.

Examples of U2U services

  • Social media services.
  • Video-sharing services.
  • Messaging services.
  • Marketplaces and listing services.
  • Dating services.
  • Review services.
  • Gaming services.
  • File sharing services.
  • Audio sharing services.
  • Discussion forums and chat rooms.
  • Information sharing services, such as online encyclopaedias and question and answer services.
  • Fundraising services.
  • Services with user-generated pornographic content.

2. Search services, specifically those offering search functionalities across multiple websites or databases.

3. Services which publish or display pornography (pornography created and shared by users falls within the U2U category).

Extraterritorial reach

The OSA extends beyond UK-based services. It applies to any online service with "links to the UK". A service falls within the OSA’s reach if it targets the UK market, for example by marketing to, or generating revenue from, UK users, or if it has a significant number (which is not defined) of UK users. The OSA also applies where there is a material risk of significant harm to individuals in the UK.
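For teams tracking scope across a portfolio of services, the statutory test can be read as a simple disjunction. The sketch below is illustrative only, with invented field names – the OSA does not reduce any of these limbs to a yes/no flag, so each input reflects a judgement that must be made elsewhere:

```python
from dataclasses import dataclass

@dataclass
class ServiceProfile:
    # Hypothetical inputs - each flag records a judgement made elsewhere;
    # the OSA does not define "significant number".
    targets_uk_market: bool            # marketing to, or revenue from, UK users
    significant_uk_users: bool         # "significant number" is undefined
    material_risk_of_harm_in_uk: bool  # material risk of significant harm to UK individuals

def has_links_to_uk(service: ServiceProfile) -> bool:
    """Rough reading of the "links to the UK" test: any one limb suffices."""
    return (
        service.targets_uk_market
        or service.significant_uk_users
        or service.material_risk_of_harm_in_uk
    )
```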

Certain services and content are exempt…

  • Internal business tools, such as intranets or business collaboration tools.
  • Communications by email, SMS or MMS, or 1:1 live aural communications.
  • Comments on content the service itself provides, or limited functionality reactions to other users' comments, such as "likes", "yes/no" voting, numerical ratings or emojis.

Phased Implementation

Although the OSA hit the statute books in October 2023, its implementation is phased and relies upon the finalisation of codes of practice and secondary legislation to trigger obligations and to fill in the detail.

It is being rolled out in three phases:

  • Phase 1 (illegal harms duties).
  • Phase 2 (child safety, pornography, and the protection of women and girls).
  • Phase 3 (duties on categorised services – see "Duties specifically for providers of Category 1, 2A, and 2B services" below).

The codes of practice comprise Ofcom’s non-binding recommendations. Abiding by them will ensure compliance, i.e. they act as a "safe harbour". If a provider deviates from a code of practice and chooses to take "alternative measures", it must record the steps taken and explain how compliance is achieved.

The key deadlines for the next 12 months are set out below – at the time of writing, deadlines for all businesses within the scope of the OSA are rapidly approaching:

16 March 2025. Deadline to complete an initial illegal harms risk assessment and begin implementing safety measures.

16 April 2025. Deadline to complete children's access assessments to see if under-18s are likely to use the services.

It will be important for providers to demonstrate that they are making concerted efforts towards achieving compliance, even if full compliance is not achievable by the deadlines. Ofcom has referred to a six-month period of forbearance for services to fully implement safety measures, but has made it very clear that it will not hesitate to take earlier action against the most egregious cases of non-compliance.

On 3 March 2025 Ofcom launched an enforcement programme to monitor whether providers are complying with their illegal content risk assessment duties and record keeping duties.  The regulator issued formal information requests to several large services, as well as to smaller but high-risk sites, mandating the submission of their illegal harms risk assessments by 31 March 2025.

Duties imposed on all service providers

One of the main objectives of the OSA, and the focus for the first phase, is to tackle illegal content, with a particular focus on "priority offences".  Even though the duties apply to all in-scope providers, expectations are scaled according to the size and risk level of the service.

What is illegal content?

Illegal content under the OSA includes any material constituting a "relevant offence". These offences fall into two categories:

  • Priority Offences: Listed in schedules 5, 6, and 7 of the OSA, including terrorism, child sexual exploitation and abuse, assisting suicide, harassment, drugs and firearms offences, immigration and human trafficking, sexual exploitation, financial crimes, and more.
  • Other Offences: Any other criminal offences where the victim is an individual.

Service providers must treat content as illegal content where there are "reasonable grounds to infer" that it amounts to a relevant offence, judged on all "relevant information that is reasonably available". Ofcom has issued guidance to help with illegal content judgements.

Duties for U2U and search services are slightly different and are addressed separately in the OSA and codes of practice, although we have wrapped them up together here: 

  1. Illegal content risk assessments. By 16 March 2025 all regulated services must carry out an initial illegal content risk assessment that is “suitable and sufficient” in respect of illegal content, taking into account the risk factors set out in the OSA and Ofcom's Risk Profiles. This assessment must then be kept up to date (reviewed annually and whenever a significant change is made to the design or operation of the service). Ofcom has made a compliance tool available to help with risk assessments and recommends a four-step process in its guidance. Providers are required to consider the 17 different kinds of priority illegal content, identify whether there is a risk of each taking place on the service and assign a "risk level" to each (a simple illustration of this exercise follows this list).

  2. Safety measures. The safety measures are shaped by the risks identified in the risk assessment. Much like the GDPR's "privacy-by-design" principle, the OSA embeds safety-by-design. From 16 March 2025, regulated providers must start implementing proportionate measures relating to the design and operation of the service to prevent users from encountering priority illegal content, mitigate the risk of the service being misused to commit priority offences, reduce the harm caused by illegal content, minimise the time for which it is present, and quickly remove flagged illegal content.

  3. Terms of service. Clear and accessible provisions in the terms of service must explain how individuals are protected from illegal content, including information about any proactive technology used and about complaints processes and procedures.

  4. Content reporting. Regulated providers must implement easy-to-use mechanisms allowing users and "affected persons" to report illegal content and content harmful to children.

  5. Complaints processes. They must also operate easy-to-use and transparent complaints procedures for various types of complaint and take appropriate action to address the complaints.

  6. Freedom of expression and privacy. Providers are required to balance safety measures with protecting users' rights to freedom of expression and privacy.

  7. Record keeping. Providers must keep records of their risk assessments and of the measures taken to comply with particular duties. Ofcom has published guidance: Record Keeping and Review Guidance.
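As flagged in item 1 above, one way to operationalise the risk-levelling exercise is as a register mapping each kind of priority illegal content to an assessed risk level. The sketch below is purely illustrative: the three entries are a sample of the 17 kinds, and the low/medium/high banding follows the style of Ofcom's guidance rather than any statutory scale.

```python
from enum import Enum

class RiskLevel(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"

# Sample entries only - a real register would cover all 17 kinds of
# priority illegal content identified in the OSA and Ofcom's Risk Profiles.
risk_register: dict[str, RiskLevel] = {
    "terrorism": RiskLevel.LOW,
    "child sexual exploitation and abuse": RiskLevel.MEDIUM,
    "fraud and financial services offences": RiskLevel.HIGH,
}

def kinds_needing_mitigation(register: dict[str, RiskLevel]) -> list[str]:
    """Kinds of content assessed above low risk, i.e. candidates for targeted safety measures."""
    return [kind for kind, level in register.items() if level is not RiskLevel.LOW]
```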

Measures vary by reference to size and risk of services

Ofcom's codes of practice contain much more detail on the recommended measures, differentiating them by service size and risk. "Large services" are services with an average user base exceeding 7 million active users per month. A service is "multi-risk" if users face significant risks from at least two different kinds of harm. Larger or multi-risk services face stricter compliance requirements, such as automated hash-matching for detecting child sexual abuse material and annual board-level risk assessments.
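Expressed as code, this gating comes down to two simple tests. A minimal sketch, assuming the 7 million figure from the codes of practice and treating the set of significant harms as an input produced by the risk assessment:

```python
LARGE_SERVICE_THRESHOLD = 7_000_000  # average monthly active users, per the codes

def is_large_service(avg_monthly_active_users: int) -> bool:
    return avg_monthly_active_users > LARGE_SERVICE_THRESHOLD

def is_multi_risk(significant_harm_kinds: set[str]) -> bool:
    # "Multi-risk": significant risk from at least two different kinds of harm.
    return len(significant_harm_kinds) >= 2

# e.g. a 10m-user service with two significant harms identified would
# attract the stricter measures recommended in the codes of practice.
assert is_large_service(10_000_000)
assert is_multi_risk({"CSEA", "fraud"})
```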

Duties specifically for providers of Category 1, 2A, and 2B services

There are additional duties, the focus of Phase 3 implementation, which apply to specific categories of provider - in essence, high-risk and high-reach services. Categorisation will depend on a service meeting threshold conditions, based on factors such as monthly active UK users and functionalities that amplify content, as defined in secondary legislation (currently in draft). Ofcom plans to publish the register of categorised services in summer 2025.

What additional duties apply to these categorised services?

  • Annual Transparency Reports. These providers must produce annual transparency reports detailing compliance efforts, risk mitigation actions, and incidents of illegal content handled.
  • Enhanced Risk Assessments. Category 1 (largest U2U) and 2A (largest search) providers face additional obligations for detailed and frequent risk assessments specifically targeting illegal content and high-risk areas, as well as further record-keeping obligations.
  • Fraudulent Advertising Prevention. They must also establish systems to prevent fraudulent advertisements on their platforms, including monitoring and removing such content swiftly.
  • Protection of Democratic Content. Category 1 providers have enhanced duties to safeguard content crucial to democratic processes, such as news publisher content and journalistic content.
  • Further Category 1 Duties. Category 1 providers have additional requirements in respect of terms of service, user empowerment and user identity verification options.

Duties in respect of children

Is the service likely to be accessed by children?

By 16 April 2025, all regulated providers must assess whether their service is likely to be accessed by under-18s (Ofcom has published a decision tree to assist). A service can only be treated from the outset as not likely to be accessed by children if it has "highly effective" age assurance measures in place. Otherwise, the question is whether the service has a "significant number" of UK child users or is of a kind likely to attract a significant number of UK child users. There is no statutory definition of "significant", and it is highly context specific - there can still be a "significant" number of child users even if children make up a relatively small proportion of the service's user base. The regulated provider must keep a written record of the children's access assessment.
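Ofcom's decision tree reduces to a short piece of logic. The sketch below is a simplification, assuming each question has already been answered elsewhere; "significant" is left as an input precisely because the OSA does not define it:

```python
def likely_accessed_by_children(
    highly_effective_age_assurance: bool,
    significant_uk_child_users: bool,
    likely_to_attract_uk_child_users: bool,
) -> bool:
    """Simplified reading of the children's access assessment."""
    if highly_effective_age_assurance:
        # Only "highly effective" age assurance lets a provider conclude at
        # the outset that children are not likely to access the service.
        return False
    return significant_uk_child_users or likely_to_attract_uk_child_users
```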

Child risk assessments and mitigation measures

If the service is “likely to be accessed by children”, the provider must carry out and record a “suitable and sufficient” children’s risk assessment to assess the risk of physical or psychological harm to children for each type of harmful content. 

The OSA lists three types of harmful content:

  • Primary priority content, e.g. pornography and content encouraging suicide, self-harm or eating disorders.

  • Priority content, e.g. bullying and serious violence.

  • Non-designated content, which is other content presenting a risk of substantial harm to an appreciable number of UK children.

The provider must keep assessments under review and mitigate the risks identified.  These duties are likely to come into effect in July 2025.

Enforcement

Ofcom has said that its enforcement strategy under the OSA will be proportionate and risk-based, focusing on serious and strategically substantial breaches. In addition to significant investigatory and fining powers (maximum fines of the greater of £18 million or 10% of the provider's qualifying worldwide revenue for the most serious breaches), it has various other enforcement tools at its disposal, including, in cases of extreme non-compliance, seeking court orders that can restrict or block access to a service in the UK. Senior management may also face criminal liability for repeated or severe non-compliance.
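For a sense of scale, the cap is simply the greater of the two limbs, so the 10% limb governs for any provider with qualifying worldwide revenue above £180 million. A one-function illustration (amounts in pounds; the revenue figure is hypothetical):

```python
def maximum_fine(qualifying_worldwide_revenue: float) -> float:
    """Greater of GBP 18 million or 10% of qualifying worldwide revenue."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue)

print(maximum_fine(2_000_000_000))  # GBP 2bn revenue -> cap of GBP 200m
```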

The international outlook for online safety

Interaction with the Digital Services Act (DSA)

A significant number of providers regulated by the OSA will also be subject to the EU’s DSA, given both have extraterritorial reach. Despite their shared goal of enhancing online safety and the fact that Ofcom aims to take an approach that is "coherent with counterpart regulatory frameworks", the OSA and DSA diverge in their scope and approach in several respects.    

Key differences between the OSA and DSA

  • Scope of Services: The OSA focuses on user-to-user services, search engines and pornography sites, whereas the DSA encompasses a broader array of online intermediaries and platforms. Each regime imposes additional obligations on the very largest providers. It does not necessarily follow, however, that all providers designated as "very large online platforms" (VLOPs) or "very large online search engines" (VLOSEs) under the DSA will be categorised as Category 1 or 2A providers respectively under the OSA.

  • Content Coverage: The OSA is narrower, focusing on illegal content and content harmful to children but excluding IP, trading and product safety laws. The DSA includes a wider range of illegal content, covering EU and member state laws.

  • Risk Assessments: All in-scope OSA services must perform risk assessments, whereas, under the DSA, only VLOPs and VLOSEs must do so.

  • Proactive Measures: The OSA demands proactive content measures from all providers; the DSA requires them for VLOPs, focusing otherwise on notice and takedown procedures. The OSA, however, offers more flexibility than the DSA as to how compliance can be achieved.

  • Advertisement Regulations: The DSA imposes broader advertising obligations and targeting restrictions, as opposed to the OSA’s more limited scope on fraudulent ads.

Whereas in the US…

In the first few weeks of the Trump administration, major US social media companies have watered down or abandoned their content moderation systems that deal with harmful material.  In February, the White House threatened to impose tariffs on European countries that have implemented a digital services tax and mentioned, as potentially also giving rise to retaliatory measures, “regulations imposed on United States companies by foreign governments that could inhibit the growth or intended operation of United States companies”.  So far, the UK has downplayed suggestions of amendments to online safety laws in return for a favourable deal on tariffs.

Next steps

With a diverse range of businesses impacted by the OSA, there is no one-size-fits-all compliance strategy. However:

  • Businesses should prioritise the timely completion of illegal content risk assessments and children's access assessments.
  • Staying up to date with Ofcom's guidance is essential for navigating the OSA's requirements effectively.
  • Where appropriate, using the tools provided by Ofcom, such as decision trees and risk assessment templates, as well as (where relevant) leveraging learnings from DSA and video-sharing platform compliance, can help streamline compliance efforts.
  • Cross-functional engagement within the business and the integration of online safety measures early in the design process (given that product changes often require significant lead times) are crucial for achieving successful compliance.

Our Technology and Commercial Transactions team has extensive expertise and experience in advising digital businesses and online platforms on complex and challenging compliance programmes. We can help you navigate the complexities of online safety requirements with pragmatic, commercially focussed advice and solutions.

Get in touch

Louisa Chambers
Helen Reddish