13 June 2022

The Online Safety Bill - the UK's answer to addressing online harms – 6 of 6 Insights

Regulating and enforcing the Online Safety Bill – Ofcom's powers and duties

Debbie Heywood looks at Ofcom's wide range of duties and powers under the Online Safety Bill.

Author

Debbie Heywood

Senior Professional Support Lawyer

The Online Safety Bill (OSB) will introduce a statutory duty of care on providers of certain user-to-user and search services to protect their users from harm. It contains wide-ranging and complex obligations, adopting a risk-based approach which may make it hard for providers to understand whether they are caught and, if they are, what they must do.

Enter Ofcom, which will oversee and enforce the regime – giving it significantly increased responsibilities and powers, not least the ability to fine companies up to £18 million or 10% of qualifying revenue (whichever is higher) if they fail in their new duty of care.

Risk assessments and profiles

Perhaps the starting point for the entire online safety regime is the requirement on Ofcom to carry out risk assessments and develop risk profiles. First, the risks of harm to individuals presented by regulated services must be identified, assessed and understood, considering the levels and kinds of risk posed by illegal content to UK individuals, and by harmful content to children and to adults.

Ofcom must then develop risk profiles for different types of regulated services based on the characteristics of the service including functionality, user base, business model, governance and systems. It will be required to produce guidance to accompany these profiles.

This is crucially important because it is against these risk profiles that companies will need to self-assess to decide where they fit in and what they need to do.

Register of regulated services

In-scope businesses will be categorised by Ofcom according to the number of users of a service, its functionalities and the risk of harmful content spreading. The highest-risk user-to-user services (most likely the largest social media companies) will be in Category 1, while other services will be placed into Categories 2A and 2B if they meet the user and functionality criteria and the relevant conditions for categorisation set out in Annex 4.

It's Ofcom's job to establish and maintain a register of regulated services, by category, listing the services it considers likely to meet the respective category criteria. This means that it is Ofcom which makes the initial decision about which service providers are in scope, although its decisions can be appealed.

Ofcom also has powers to impose fees on regulated service providers. Companies above a (to be determined) threshold based on global annual revenue will have to notify Ofcom and pay an annual fee. The threshold is likely to be high enough to mean this will only apply to a small number of businesses.

Guidance and Codes of Practice

The Online Safety Bill places considerable emphasis on a risk-based approach. This makes sense given the vast array of content and services it potentially covers, but that also makes compliance a challenge. Again, it's Ofcom's job to demystify the process.

In addition to guidance on, for example, risk profiles, children's access requirements, user identity verification, freedom of expression and privacy, and elements like technology transfer assessments and enforcement powers (see below), Ofcom is required to produce regularly reviewed and updated Codes of Practice. These will set out steps to help relevant providers of regulated services comply with their duties across a wide range of compliance requirements, including those relating to terrorism content, CSEA (Child Sexual Exploitation and Abuse) content, and fraudulent advertising. The Secretary of State has a fair amount of oversight in relation to the Codes. Adopting the practices set out in the Codes will not necessarily be mandatory, but using those measures will create a presumption of compliance with the respective duty.

Until this additional material is produced, it is difficult for service providers to understand exactly what is required of them (although the UK government published two Interim Codes of Practice in December 2020). This is partly because the OSB itself is an extremely lengthy piece of legislation, but also because flesh needs to be added to the bones of the definitions, outline obligations and exemptions (which in some cases are at Ofcom's discretion). Hopefully, this is what Ofcom's guidance and Codes of Practice will provide in due course.

Notices to deal with terrorism and/or CSEA content

Where Ofcom has reason to believe a provider of a regulated service is not complying with its duties in relation to illegal terrorism or CSEA content, it can require the provider to use specific technology to help it identify and remove that content.

The process starts with Ofcom issuing a warning notice. Under the notice, the provider will be required to use specific technology accredited by Ofcom (or a body appointed by Ofcom). If the provider is already using it, then Ofcom can specify how to use it more effectively. It can also issue further notices requiring additional or alternative technology be used.

All notices must contain stipulated information and can last for up to 36 months. They can only place requirements on regulated services in the UK, or as they impact UK users. Ofcom must publish guidance about these notices, as well as an annual report about the exercise of its functions in relation to them, setting out the technology which meets, or is being developed to meet, the required standards.

Information, investigations and interviews

Ofcom has the power to request information from pretty much anyone it thinks can provide the information required to help it carry out or decide how to carry out its duties under the OSB. It may also require a relevant senior manager to be named, and conduct audits.

There are various offences associated with failing to provide information, failing to comply with an audit notice, or knowingly or recklessly providing false information. Named individuals can also commit offences and be liable for the company's failures.

Ofcom may also commission reports on compliance failures to help it understand risk and ways to mitigate that risk, and require interviews as part of investigations into compliance failures, which it can compel providers to participate in.

Enforcement powers

Ofcom has a wide range of enforcement powers under the legislation. These include:

  • Notices of contravention – these can be issued to service providers and individuals. Notices must set out which enforceable requirements (as set out in the legislation) need to be complied with. A provisional notice will precede a confirmation decision, which can be issued where there has been a failure to remedy the issue identified in the provisional notice. Confirmation decisions will set out the duty to which the determination relates, when the identified risk must be mitigated, and when the duty must be complied with. In certain circumstances, a confirmation decision may include a requirement to use proactive technology.
  • Penalties – penalties of up to 10% of annual global qualifying revenue or £18 million (whichever is higher) can be imposed by a confirmation decision or a penalty notice. Further daily-rate penalties can be imposed for ongoing non-compliance in some areas, and additional penalties may be imposed for failure to comply with notices on terrorism and/or CSEA content or with confirmation decisions, and for non-payment of fees. The criteria for assessing the amount of any penalty are set out in Schedule 12.
  • Business disruption measures – Ofcom may apply to the court for a service restriction order to impose requirements on a provider of a regulated service or on a service ancillary to it. It can also apply for an access restriction order where a service restriction order (or interim service restriction order) has failed to prevent significant harm to individuals in the UK, or would be unlikely to prevent it if made.

Ofcom is required to publish details of enforcement actions, unless they are commercially sensitive or otherwise inappropriate for publication in Ofcom's opinion. It is also required to publish guidance on how it intends to use its enforcement powers.

Committees, reports, transparency and promotion of media literacy

Ofcom has several wider policy-focused duties to carry out under the OSB, which also amends provisions on Ofcom's duties in the Communications Act.

It is required to set up an advisory committee on disinformation and misinformation to provide advice to Ofcom on dealing with those issues. It also has to carry out research, and to issue every relevant service provider with a notice requiring a transparency report about their service. Ofcom will base its own transparency reports on these, summarising its conclusions on patterns and trends, steps considered to be good practice, and anything else relevant.

This is in addition to its annual report and other reports it is required or chooses to make on online safety matters, which include reviews and reports of the incidence of different types of harmful content appearing on a variety of services, and a statement about freedom of expression and privacy.

Finally, it has a duty to promote media literacy which involves identifying and taking any steps it thinks appropriate as well as, of course, producing guidance.

Appeals and super-complaints

Ofcom's decisions are subject to appeal. 

Eligible entities (to be determined under regulations) can also make a super-complaint to Ofcom that any feature of one or more regulated services, or the conduct of service providers, presents a material risk of significant harm to users, to freedom of expression or privacy, or of any other significant adverse effect.

Complaints can be made against a single regulated service or provider only where Ofcom considers the complaint is particularly important or relates to a particularly large number of users or members of the public.

Again, Ofcom will be required to publish guidance.

Regulating a risk-based approach

We're increasingly seeing legislation take a risk-based approach, particularly where complex technology or issues are involved and where a wide range of use cases are covered, from the GDPR to the EC's draft AI Regulation. The OSB follows the same approach.

This places a considerable burden on the appointed regulator (Ofcom in this case). This is partly administrative, but also policy-based, as guidance and Codes of Practice develop to help companies understand the nuances of compliance.

Ofcom has welcomed its new role, but there is no doubt that the role will add considerably to its workload, given the scale and complexity of regulating not just illegal but also harmful online content.

Find out more

To discuss the issues raised in this article in more detail, please reach out to a member of our Technology, Media & Communications team.
