13 June 2022

The Online Safety Bill – the UK's answer to addressing online harms – 3 of 6 Insights

Risk assessments under the Online Safety Bill

Mark Owen looks at the role of risk assessments and risk profiles in determining the scope and application of the Online Safety Bill.


The Online Safety Bill (OSB) takes a systemic approach to how providers of certain user-to-user or search services are to deal with online safety, rather than providing a prescriptive list of harmful activities and how those are to be dealt with.

Fundamental to this approach is that the service providers in scope will be obliged to carry out substantial and ongoing risk assessments, and to implement measures to mitigate the risks identified, such as modifications to their services.

Many providers already carry out various risk assessments, but the OSB will impose new and wider duties around them. As a result, risk assessments will become an even more crucial part of how many digital services are operated.

What do risk assessments entail?

The OSB and its supporting documentation contain only high-level and somewhat vague directions as to what the various risk assessment duties will entail and how they may be satisfied. It is already clear that the duties will be both substantial and continuous, but further detail is expected to emerge, including in the guidance and Codes of Practice for compliance which the regulator, Ofcom, will be developing (see here for more about Ofcom's role).

A sensible and commonly accepted approach will be vital both to providers knowing how they can comply and to acceptance of the new regime by all stakeholders, including the general public.

The obligations will be part of the relevant service providers' duties of care, and will include the following duties, matching the three categories which appear throughout the OSB:

  • an illegal content risk assessment duty
  • children's risk assessment duties
  • an adults' risk assessment duty in respect of harmful content, for certain services.

Providers will be obliged to consider risk profiles for their type of service developed by Ofcom (discussed below), then apply those specifically to how their service operates. This includes how the technology operates, the functionality it offers (defined according to a lengthy list of features) and how the design and operation of the whole service may increase or reduce risks. 

This goes beyond consideration of the technology and extends into analysis of how the service is used and by whom, how likely the users in question are to encounter the harmful or illegal content and the nature and severity of the possible harm if they do. It will also look at the provider's governance systems and even its whole business model. 

It remains to be seen whether this approach will pose an existential threat to some providers' business models, or whether the government has in mind only more detailed modifications to services.

Ofcom's role: risk profiles

The first set of risk assessments is to be carried out by the regulator, Ofcom, to "identify, assess and understand" the risks of harm that regulated services may give rise to. These will follow the same three categories. Ofcom must then prepare "risk profiles" for different types of service. These will be based partly on Ofcom's own risk assessments and partly on a catch-all phrase, the "characteristics of the services".

Although the intention appears to be that Ofcom's risk assessments will cover types of service rather than the services offered by specific providers, the "characteristics" of services include elements which are specific to particular services, such as their functionalities, governance, business model and user base. As with other parts of the OSB, the drafting is somewhat circular and vague, but the intention seems clear: Ofcom will have broad discretion to decide what risks there may be.

Different levels of risk assessment

Risk assessments which factor in the Ofcom risk profiles are to be carried out by providers of user-to-user services. All such services must assess illegal content risks and risks to children. Regarding children, the provider must go further and supply Ofcom with details of certain types of content which are harmful to children. Category 1 services must, in addition, assess the risks to adults from harmful content.

Search services will have to conduct illegal content and children's risk assessments (we discuss children's risk assessments in more detail here).

Timings of risk assessments

The systemic approach depends upon a continual process of risk assessment by both Ofcom and service providers. Both are under an obligation to keep their risk assessments up to date, so this should be seen as a continuous process requiring specific dedicated resource within an organisation, rather than a one-off exercise that, once carried out, is satisfied.

Providers will also have to conduct risk assessments at specific times:

  • within three months of Ofcom publishing its risk assessments
  • when Ofcom makes a significant change to the risk profile applicable to that service
  • before the launch of relevant new services
  • when making any significant relevant change to the design or operation of the service, and
  • when the service changes to become a "regulated service".

Next steps

The design of risk profiles and risk assessments will be among the first steps in implementing the new online safety regime. This will be a major task for both Ofcom and service providers, and if risks and potential mitigations are to be properly understood, it will be crucial for service providers to engage with the process from the outset.

Find out more

To discuss the issues raised in this article in more detail, please reach out to a member of our Technology, Media & Communications team.
