The Online Safety Bill (OSB) imposes obligations on in-scope services regarding three types of content: illegal content, content that is harmful to children, and content that is harmful to adults. It then applies further sub-categorisations within these content types. The "safety duties" that services must comply with vary depending on the content in question.
All services in scope of the OSB have obligations concerning illegal content. This comprises content which, or the possession, viewing, accessing, publication or dissemination of which, amounts to:
Under the previous draft of the Bill, content was considered illegal if the service provider had reasonable grounds to believe a relevant offence had been committed. This mental element has now been removed: only content actually amounting to an offence is considered illegal.
Services likely to be accessed by children have obligations regarding content that is harmful to children. This comprises:
Services falling within Category 1 under the Online Safety Bill have obligations concerning content that is harmful to adults (more information regarding Ofcom's role in categorising regulated services is available here). This comprises:
Different safety duties apply depending on the category of content in question. Here, we focus on the duties that apply under the Online Safety Bill to user-to-user services; the duties that apply to search engines are similar but not identical. The following sections set out the safety duties as provided in the OSB, followed by observations as to how services may be expected to comply with them.
The safety duties are closely tied to the outcome of the risk assessments that services must undertake under the Bill (more information regarding risk assessments is available here).
All services in-scope of the Online Safety Bill must comply with the following duties:
Services likely to be accessed by children must comply with the following duties regarding those parts of their services that children are able to access:
For Category 1 services that have obligations concerning content harmful to adults, there is no general duty to mitigate risk of harm and no specific content minimisation duties. However, these services do need to:
These duties reflect the idea that adults should be empowered to keep themselves safe online.
All services have obligations to operate systems and processes that allow users and other affected persons to report illegal and harmful content. They are also required to operate accessible complaints procedures, including to allow complaints by users whose content has been taken down or restricted, or who have suffered other sanctions as a result of their content.
While outside the scope of this article, it's also worth mentioning at this point that the 2022 version of the OSB introduces a new legal duty which requires the largest social media platforms and search engines to take steps to prevent fraudulent paid-for advertising from appearing on their services. See here for more.
Aside from this fairly limited guidance in the Bill itself, how might services be expected to comply, and what steps can be taken now to prepare?
The OSB provides that Ofcom must publish Codes of Practice describing recommended steps for compliance with duties. Services will be treated as having complied with their obligations under the OSB if they take the steps described in a Code of Practice.
Following the Codes of Practice will not be the only way for services to comply with their duties but may (depending on their contents) be the easiest, or at least the most certain, way to ensure compliance. In drafting the Codes of Practice, Ofcom will need to consult representatives of regulated service providers, users, experts and interest groups – there will therefore be an opportunity for affected stakeholders to make their views known before these important documents are finalised.
In the meantime, the government has published voluntary interim codes of practice for terrorism and CSEA content. The examples of good practice outlined in these Codes set a fairly high bar in terms of content minimisation measures (eg suggesting that content be identified using in-house or third-party automated tools in conjunction with human moderation) and also contain more aspirational obligations (eg regarding industry cooperation). This might reflect the seriousness of these particular categories of content and/or the voluntary nature of the Codes, but either way they are a useful guide to best practice for services preparing for compliance.
The government has also published several guidance documents regarding online safety as a general matter. These are not linked to the OSB but may be indicative of the types of measures services may be expected to take under it. The guidance is clear that the preference is for safety by design – ie services should build features and functionality that promote user safety and prevent the dissemination of illegal and harmful content rather than merely taking down content when aware of it. Measures suggested by the guidance include:
The guidance also includes a specific 'one stop shop' for businesses for child online safety.
There are a number of areas where further legislation or guidance is required in order for services to more fully understand their obligations in relation to illegal and harmful content under the Online Safety Bill. These include regulations to be made by the Secretary of State setting out "priority" categories of content harmful to children and adults and, importantly, Ofcom's Codes of Practice, which will operate as a form of 'safe harbour' for compliance. Services wishing to prepare for and engage with the Online Safety legislation in the meantime can take note of existing government guidance and participate (potentially via an industry group) in Ofcom's consultation (when it is published) regarding the Codes of Practice.
To discuss any of the issues raised in this article, please reach out to a member of our Technology, Media & Communications team.