The Online Safety Bill (OSB) imposes obligations on in-scope services regarding three types of content: illegal content, content that is harmful to children, and content that is harmful to adults. It then applies further sub-categorisations within these content types. The "safety duties" that services must comply with vary depending on the content in question.
Illegal and harmful content
All services in scope of the OSB have obligations concerning illegal content: content which, or the dissemination of which, the service provider has reasonable grounds to believe amounts to a relevant offence.
Notably, illegal content is not defined by reference to what actually is or is not an offence; instead, content will be considered illegal if the service provider has reasonable grounds to believe it amounts to a relevant offence. The OSB does not provide any further detail on what this means; it may become clearer as further guidance is published and/or as Ofcom begins to make enforcement decisions.
Services likely to be accessed by children have obligations regarding content that is harmful to children. This comprises 'priority' content designated in regulations made by the Secretary of State, together with any other content that presents a material risk of harm to children.
Services falling within Category 1 under the Online Safety Bill have obligations concerning content that is harmful to adults (more information regarding Ofcom's role in categorising regulated services is available here). This comprises 'priority' content designated in regulations made by the Secretary of State, together with any other harmful content identified in the service's risk assessment.
Different safety duties apply depending on the category of content in question. Here, we focus on the duties that apply under the Online Safety Bill to user-to-user services; the duties that apply to search engines are similar but not identical. The following sections set out the safety duties as provided in the OSB, followed by observations as to how services may be expected to comply with them.
The safety duties are closely tied to the outcome of the risk assessments that services must undertake under the Bill (more information regarding risk assessments is available here).
Illegal content
All services in scope of the Online Safety Bill must comply with safety duties regarding illegal content, including a general duty to mitigate and manage the risk of harm and specific duties to minimise the presence and dissemination of illegal content.
The government's Consultation Response suggests that the systems and processes services may use to minimise illegal or harmful content could include user tools, content moderation and recommendation procedures.
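Purely by way of illustration, the sketch below shows how those elements (content moderation combined with recommendation procedures) might fit together in code. It is a hypothetical minimal example: the risk score, thresholds and function names are assumptions, not anything prescribed by the OSB or the Consultation Response.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    ALLOW = auto()
    DEMOTE = auto()   # keep the content up but suppress it in recommendation feeds
    REMOVE = auto()   # take the content down


@dataclass
class ModerationResult:
    content_id: str
    action: Action
    reason: str


def moderate(content_id: str, risk_score: float) -> ModerationResult:
    """Hypothetical moderation step: high-risk content is removed, while
    borderline content stays up but is demoted in recommendations.
    The thresholds are illustrative, not drawn from the Bill."""
    if risk_score >= 0.9:
        return ModerationResult(content_id, Action.REMOVE, "likely illegal content")
    if risk_score >= 0.5:
        return ModerationResult(content_id, Action.DEMOTE, "potentially harmful content")
    return ModerationResult(content_id, Action.ALLOW, "no concerns identified")
```

In such a design, 'user tools' might then sit on top of the same data, for example letting users filter anything marked `DEMOTE` out of their own feeds.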
Content that is harmful to children
Services likely to be accessed by children must comply with equivalent duties in respect of those parts of the service that children are able to access, including a general duty to mitigate the risk of harm to children and specific duties to minimise children's exposure to harmful content.
Content that is harmful to adults
For Category 1 services that have obligations concerning content harmful to adults, there is no general duty to mitigate the risk of harm and no specific content minimisation duties. However, these services do need to have, and consistently apply, terms and conditions specifying how priority content, and any other harmful content identified in a risk assessment, will be dealt with by the service. This reflects the idea that adults should be empowered to keep themselves safe online.
Reporting and complaints
All services must operate systems and processes that allow users and other affected persons to report illegal and harmful content. They are also required to operate accessible complaints procedures, including to allow complaints by users whose content has been taken down or restricted, or who have suffered other sanctions as a result of their content.
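As a purely illustrative sketch of what such systems might record, the example below models a content report and a complaint against a takedown or restriction. All of the names and fields are assumptions made for the purpose of the example, not requirements drawn from the Bill.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ContentReport:
    """A report filed by a user or other affected person about content."""
    reporter_id: str
    content_id: str
    category: str                     # eg "illegal" or "harmful to children"
    details: str = ""
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class Complaint:
    """A complaint, eg by a user whose content was taken down or restricted."""
    complainant_id: str
    content_id: str
    decision_appealed: str            # eg "takedown", "restriction", "other sanction"
    grounds: str = ""
    resolution: Optional[str] = None  # populated once the complaint is resolved
```

An accessible complaints procedure would then need workflows around these records (acknowledgement, review and a reasoned outcome), the detail of which the Bill leaves to each service.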
Many of the safety duties contained in the Online Safety Bill – particularly the general duties to mitigate harm and content minimisation duties – are vague and broadly drafted. The OSB also places an emphasis on proportionality, which is to be assessed considering the findings of the service's risk assessments. This reflects the intention for the OSB to be a risk- and principles-based regulation, which is focussed on services putting in place processes and systems appropriate for their risk profile rather than rigidly following detailed prescriptive rules.
Given this, how are services expected to comply and what steps can be taken now to prepare?
The OSB provides that Ofcom must publish Codes of Practice describing recommended steps for compliance with the safety duties. Services will be treated as having complied with their obligations under the OSB if they take the steps described in a Code of Practice, provided that terrorism and child sexual exploitation and abuse (CSEA) content is not prevalent or persistently present on the service.
Following the Codes of Practice will not be the only way for services to comply with their duties, but it may (depending on their content) be the easiest, or at least the most certain, way to ensure compliance. In drafting the Codes of Practice, Ofcom must consult representatives of regulated service providers, users, experts and interest groups, so affected stakeholders will have an opportunity to make their views known before these important documents are finalised.
In the meantime, the government has published voluntary interim codes of practice for terrorism and CSEA content. The examples of good practice outlined in these codes set a fairly high bar for content minimisation measures (eg suggesting that content be identified using in-house or third-party automated tools in conjunction with human moderation) and also contain more aspirational obligations (eg regarding industry cooperation). This might reflect the seriousness of these particular categories of content and/or the voluntary nature of the interim codes, but they are nonetheless a useful guide to best practice for services preparing for compliance.
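The combination of automated tools and human moderation suggested by the interim codes might, in practice, look something like the sketch below: an automated detector triages content, only high-confidence matches are actioned automatically, and uncertain cases are routed to a human review queue. The detector interface and both thresholds are illustrative assumptions, not figures taken from the codes.

```python
from collections import deque
from typing import Callable

# Hypothetical detector: returns a confidence score in [0, 1] that the
# content matches a known category of illegal material.
Detector = Callable[[bytes], float]

human_review_queue: deque[tuple[str, float]] = deque()


def triage(content_id: str, payload: bytes, detect: Detector) -> str:
    """Combine an automated tool with human moderation: auto-action only
    high-confidence matches, and send borderline cases to human moderators
    rather than acting on them automatically."""
    score = detect(payload)
    if score >= 0.98:
        return "remove"                       # confident automated match
    if score >= 0.60:
        human_review_queue.append((content_id, score))
        return "escalate"                     # a human moderator decides
    return "allow"
```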
The government has also published several guidance documents on online safety more generally. These are not formally linked to the Online Safety Bill, but they may be indicative of the types of measures services will be expected to take under it. The guidance is clear that the preference is for safety by design: services should build features and functionality that promote user safety and prevent the dissemination of illegal and harmful content, rather than merely taking down content once they become aware of it. The guidance suggests a range of practical measures to this end.
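To make 'safety by design' more concrete, the hypothetical sketch below screens content before it becomes visible to other users, rather than relying solely on after-the-fact takedown. The `passes_safety_checks` function is an assumption standing in for whatever proactive checks a service actually builds.

```python
# Illustrative only: a publish flow in which safety checks run *before*
# content goes live, rather than content being removed reactively.

def passes_safety_checks(text: str) -> bool:
    """Stand-in for a service's own proactive checks (keyword screens,
    classifier scores, hash matching against known illegal material)."""
    banned_terms = {"example-banned-term"}    # hypothetical placeholder
    return not any(term in text.lower() for term in banned_terms)


def publish(content_id: str, text: str, store: dict[str, str]) -> bool:
    """Only content that passes the checks ever becomes visible."""
    if not passes_safety_checks(text):
        return False                          # held back by design
    store[content_id] = text                  # now visible to other users
    return True
```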
The guidance also includes a specific 'one stop shop' for businesses on child online safety.
There are several areas where further legislation or guidance is required for services to more fully understand their obligations concerning illegal and harmful content under the Online Safety Bill. These include regulations to be made by the Secretary of State setting out 'priority' categories of content and, importantly, Ofcom's Codes of Practice, which will operate as a form of 'safe harbour' for compliance.
Services wishing to prepare for and engage with the Online Safety legislation in the meantime can take note of existing government guidance and participate (potentially via an industry group) in Ofcom's consultation (when it is published) regarding the Codes of Practice.
To discuss any of the issues raised in this article, please reach out to a member of our Technology, Media & Communications team.