The Online Safety Act (OSA) will regulate the safety of users online in relation to user-generated content. It will require in-scope online user-to-user and search services to protect their users from certain types of illegal content and harms. The OSA contains wide-ranging and complex obligations, adopting a risk-based approach which may make it hard for providers to understand whether they are caught and, if so, what they must do.
Enter Ofcom, which will be overseeing and enforcing the regime – giving it significantly increased responsibilities and powers, not least of which is the ability to fine companies up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, if they fail in their duties.
Perhaps the starting point for the entire Online Safety regime is the requirement on Ofcom to carry out risk assessments. First, the risk of harm to individuals presented by regulated services must be identified, assessed and understood, considering the levels and different kinds of risk posed by illegal content to UK individuals, and by content that is harmful to children of different age groups. The findings of these risk assessments must be published in a register of risks.
Ofcom must then develop and publish risk profiles for different types of regulated services based on the characteristics of the service, risk levels and other matters identified in the relevant risk assessment. It will be required to produce guidance to accompany these profiles.
This is crucially important as it is against these risk profiles that companies will need to self-assess as part of their own risk assessments to decide where they fit and what they need to do.
In-scope businesses will be categorised by Ofcom according to the number of users of a service, its functionalities and the risk of harmful content spreading. The highest risk user-to-user services (most likely the largest social media companies) will be in Category 1, while other services will be placed into Categories 2A and 2B if they meet user and functionality criteria and the relevant conditions for categorisation to be set out in secondary legislation.
It's Ofcom's job to establish, maintain and publish a register, by category, of the regulated services it considers likely to meet the respective category criteria. This means that it is Ofcom which makes the initial decision about which service providers are in scope, although its decisions can be appealed.
Ofcom also has powers to impose fees on regulated service providers. Companies above a (to be determined) threshold based on global annual revenue will have to notify Ofcom and pay an annual fee in accordance with a Statement of Principles to be produced by Ofcom following guidance to be issued by the Secretary of State. The threshold is likely to be high enough to mean this will only apply to a small number of businesses.
The OSA places considerable emphasis on a risk-based approach. This makes sense given the vast array of content and services it potentially covers, but it also makes compliance a challenge. Again, it's Ofcom's job to demystify the process.
Ofcom must produce all sorts of guidance on, for example, risk profiles, classification of content harmful to children, user empowerment, protecting women and girls, user identity verification, transparency, freedom of expression and privacy in terms of service, and enforcement powers (see below). In addition, it is required to produce regularly reviewed and updated codes of practice setting out steps to help relevant providers of regulated services comply with a wide range of duties. These include duties regarding terrorism content and CSEA (Child Sexual Exploitation and Abuse) content, as well as the other duties on providers of Part 3 services, such as fraudulent advertising, priority illegal content, children's online safety, freedom of expression and related rights and protections, and content reporting and complaints procedures. The Secretary of State has a fair amount of oversight in relation to the codes. Adopting the practices set out in the codes will not be mandatory, but using those measures will create a presumption of compliance with the respective duty. Service providers which choose an alternative approach will have to explain how it meets the duties in the OSA.
Until this additional material is produced, it is difficult for service providers to understand exactly what is required of them (although the UK government published two Interim Codes of Practice in December 2020). This is partly because the OSA itself is an extremely lengthy piece of legislation, but also because flesh needs to be added to the bones of the definitions, outline obligations and any exemptions (which in some cases are at Ofcom's discretion). Hopefully, this is what Ofcom's guidance and codes of practice will provide in due course. You can read more about Ofcom's proposed timelines here.
Where Ofcom has reason to believe a provider of a regulated service is not complying with its duties in relation to illegal terrorism or CSEA content, it can require the provider to use specific technology to help it identify and remove the content and, potentially, to prevent individuals from encountering CSEA content, where it considers it necessary and proportionate to do so. In some cases, private as well as public content is covered. These powers have been highly controversial as they are seen as circumventing encryption and potentially requiring proactive monitoring of content. The government has said it will not bring them into force until appropriate technology exists.
The process would, however, start with Ofcom issuing a warning notice. Under the notice, the provider will be required to use specific technology accredited by Ofcom (or a body appointed by Ofcom). If the provider is already using it, then Ofcom can specify how to use it more effectively. It can also issue further notices requiring additional or alternative technology be used.
All notices must contain stipulated information and can last for up to 36 months. They can only place requirements on regulated services in the UK or as they impact UK users. Ofcom must publish guidance about these notices, and an annual report about the exercise of its functions in relation to them, which must also list out technology which meets or is being developed to meet required standards.
Ofcom will be able to require information relating to a child's social media account if requested to do so by a coroner and will be required to produce expert reports where requested by the coroner. Ofcom will also be able to share information with coroners without business consent.
Ofcom has the power to request information from pretty much anyone it thinks can provide the information required to help it carry out or decide how to carry out its duties under the OSA provided it acts proportionately. It may also require a relevant senior manager to be named, and conduct audits.
There are various offences associated with failure to provide information or comply with an audit notice, or for knowingly or recklessly providing false information. Named individuals can also commit offences and be liable for the company's failures.
Ofcom may also commission reports on compliance failures to help it understand risk and ways to mitigate that risk, and require interviews as part of investigations into compliance failures, which it can compel providers to participate in.
Ofcom has a wide range of enforcement powers under the legislation, including issuing provisional notices of contravention and confirmation decisions requiring remedial steps, imposing financial penalties, and applying to the courts for business disruption measures (service restriction and access restriction orders).
Ofcom is required to publish details of enforcement actions unless they are commercially sensitive or otherwise inappropriate for publication in Ofcom's opinion. It is also required to publish guidance on how it intends to use its enforcement powers.
Ofcom has several wider policy-focused duties under the OSA, which also amends provisions on Ofcom's duties in the Communications Act. The Secretary of State can make statements of strategic priorities, which Ofcom needs to respond to and, where appropriate, act upon.
It is required to set up an advisory committee of stakeholders and experts to advise Ofcom on issues including disinformation and misinformation, and media literacy (which it has a duty to promote under the Communications Act). It also has to carry out research and issue every relevant service provider with a notice to provide a transparency report about their service, on which Ofcom will base its own transparency reports summarising its conclusions on patterns and trends, steps considered to be good practice, and anything else relevant.
This is in addition to its annual report and other reports it is required or chooses to make on online safety matters, which include reviews and reports of the incidence of different types of harmful content appearing on a variety of services, reports about the use of age assurance, a report about the use of app stores by children, and a statement about freedom of expression and privacy.
Finally, it has an extended duty to promote media literacy which involves identifying and taking any steps it thinks appropriate as well as, of course, producing guidance.
Ofcom's decisions are subject to appeal.
Eligible entities (to be determined under Regulations) can also make a super-complaint to Ofcom that any feature of one or more regulated services, or the conduct of service providers, presents a material risk of significant harm to users, a significant adverse impact on freedom of expression or privacy, or any other significant adverse effect.
Complaints can be made against a single regulated service or provider only where Ofcom considers the complaint is particularly important or relates to a particularly large number of users or members of the public.
Again, Ofcom will be required to publish guidance.
We're increasingly seeing legislation take a risk-based approach, particularly where complex technology or issues are involved and where a wide range of use cases are covered, from the (UK) GDPR to the EC's draft AI Regulation. The OSA follows the same approach.
This places a considerable burden on Ofcom as the appointed regulator. This is partly administrative, but also policy-based, as guidance and codes of practice develop to help businesses understand the nuances of compliance.
Ofcom welcomed its new role and has already made considerable progress, aided by the lengthy passage of the OSA through Parliament. There is, however, no doubt that this legislation will add considerably to its workload given its scale and complexity.
You can find out more about what to expect from Ofcom and when here.