Author
Xuyang Zhu
Senior associate

30 July 2021


Online Safety Bill – illegal and harmful content and safety duties

In-depth analysis

The Online Safety Bill (OSB) imposes obligations on in-scope services regarding three types of content: illegal content, content that is harmful to children, and content that is harmful to adults. It then applies further sub-categorisations within these content types. The "safety duties" that services must comply with vary depending on the content in question. 

Illegal and harmful content


Illegal content

All services in scope of the OSB have obligations concerning illegal content. This comprises content which, or the dissemination of which, the service provider has reasonable grounds to believe amounts to:

  • an offence relating to terrorism or child sexual exploitation and abuse (CSEA); these offences are specified in Schedules to the Bill 
  • an offence specified in regulations to be made by the Secretary of State; this type of content is defined as "Priority illegal content" and the UK government's response to the Online Harms White Paper consultation (Online Harms Consultation Response) indicates that this category may include hate speech and the sale of illegal drugs and weapons, or
  • any other offence of which the victim is an individual or individuals, except offences relating to the infringement of IP rights, the safety or quality of goods, or the performance of a service by a person not qualified to perform it. 

Notably, illegal content is not defined by reference to whether the content actually amounts to an offence; instead, content will be treated as illegal if the service provider has reasonable grounds to believe that it amounts to a relevant offence. The OSB does not provide any further detail on what this means – it may become clearer as further guidance is published and/or as Ofcom begins to make enforcement decisions. 

Content harmful to children

Services likely to be accessed by children have obligations regarding content that is harmful to children. This comprises:

  • Content that amounts to "primary priority content" or "priority content" – content falling within both categories will be designated in regulations to be made by the Secretary of State. The Online Harms Consultation Response indicates that these might include violent and/or pornographic content.
  • Content that the service provider has reasonable grounds to believe would give rise to a material risk of significant adverse physical or psychological impact on a child of ordinary sensibilities of any age, except where the impact flows from the content's potential financial impact, the safety or quality of goods featured in the content, or the way in which a service featured in the content may be performed. When assessing the impact of content, if the content would particularly affect a child with a particular characteristic or who is a member of a particular group, the child of ordinary sensibilities should be assumed to have that characteristic or be a member of that group. 

Content harmful to adults

Services falling within Category 1 under the Online Safety Bill have obligations concerning content that is harmful to adults (more information regarding Ofcom's role in categorising regulated services is available here). This comprises: 

  • Content that amounts to "priority content" – content falling within this category will be designated in regulations to be made by the Secretary of State. The Consultation Response indicates that this might include abuse that doesn't amount to an offence, and content about eating disorders, self-harm and suicide.
  • Content that the service provider has reasonable grounds to believe would give rise to a material risk of significant adverse physical or psychological impact on an adult of ordinary sensibilities, except where the impact flows from the content's potential financial impact, the safety or quality of goods featured in the content, or the way in which a service featured in the content may be performed. When assessing the impact of content, if the content would particularly affect an adult with a particular characteristic or who is a member of a particular group, the adult of ordinary sensibilities should be assumed to have that characteristic or be a member of that group. 

Safety duties

Different safety duties apply depending on the category of content in question. Here, we focus on the duties that apply under the Online Safety Bill to user-to-user services; the duties that apply to search engines are similar but not identical. The following sections set out the safety duties as provided in the OSB, followed by observations as to how services may be expected to comply with them.

The safety duties are closely tied to the outcome of the risk assessments that services must undertake under the Bill (more information regarding risk assessments is available here).


Illegal content

All services in scope of the Online Safety Bill must comply with the following duties:

  • take proportionate steps to mitigate and effectively manage the risks of harm to individuals as identified in the illegal content risk assessment
  • use proportionate systems and processes designed to minimise the presence of priority illegal content, the length of time it is present, and its dissemination (the legislation does not specifically provide for content minimisation duties for terrorism and CSEA content – these will likely be set out in the mandatory codes of practice that will replace the current interim voluntary codes for these categories of content (see below))
  • use proportionate systems and processes designed to swiftly take down illegal content when notified or otherwise aware of it, and
  • specify clearly in terms and conditions how individuals are to be protected from illegal content, addressing terrorism and CSEA content, priority illegal content, and other illegal content separately, and apply terms and conditions consistently. 

The government's Consultation Response suggests that the systems and processes services may use to minimise illegal or harmful content could include user tools, content moderation and recommendation procedures. 

Content harmful to children

Services likely to be accessed by children must comply with the following duties in relation to those parts of their services that children are able to access:

  • take proportionate steps to mitigate and effectively manage the risks of harm, and the impact of harm, to children in different age groups as identified in the children's risk assessment
  • use proportionate systems and processes designed to prevent children of any age from encountering primary priority content
  • use proportionate systems and processes designed to protect children in age groups assessed to be at risk from encountering priority content and any non-designated harmful content identified in a risk assessment
  • specify clearly in terms and conditions how children are to be prevented or protected from encountering harmful content, addressing each kind of primary priority and priority content separately (eg violent content and pornographic content would need to be individually addressed if they fall within these categories), and apply terms and conditions consistently. 

Content harmful to adults

For Category 1 services that have obligations concerning content harmful to adults, there is no general duty to mitigate risk of harm and no specific content minimisation duties. However, these services do need to have and consistently apply terms and conditions that specify how priority content, and any other harmful content identified in a risk assessment, will be dealt with by the service. This reflects the idea that adults should be empowered to keep themselves safe online. 

All illegal and harmful content – reporting and redress

All services must operate systems and processes that allow users and other affected persons to report illegal and harmful content. They are also required to operate accessible complaints procedures, including for complaints by users whose content has been taken down or restricted, or who have suffered other sanctions as a result of their content. 

How to comply – current government guidance and future codes of practice

Many of the safety duties contained in the Online Safety Bill – particularly the general duties to mitigate harm and content minimisation duties – are vague and broadly drafted. The OSB also places an emphasis on proportionality, which is to be assessed considering the findings of the service's risk assessments. This reflects the intention for the OSB to be a risk- and principles-based regulation, which is focussed on services putting in place processes and systems appropriate for their risk profile rather than rigidly following detailed prescriptive rules. 

Given this, how are services expected to comply and what steps can be taken now to prepare? 

Codes of Practice

The OSB provides that Ofcom must publish Codes of Practice describing recommended steps for compliance with the safety duties. Services will be treated as having complied with their obligations under the OSB if they take the steps described in a Code of Practice, provided that terrorism and CSEA content is not prevalent or persistently present on the service.

Following the Codes of Practice will not be the only way for services to comply with their duties but may (depending on their contents) be the easiest, or at least the most certain, way to ensure compliance. In drafting the Codes of Practice, Ofcom will need to consult representatives of regulated service providers, users, experts and interest groups – there will therefore be an opportunity for affected stakeholders to make their views known before these important documents are finalised.

In the meantime, the government has published voluntary interim codes of practice for terrorism and CSEA content. The examples of good practice outlined in these Codes set a fairly high bar in terms of content minimisation measures (eg suggesting that content be identified using in-house or third-party automated tools in conjunction with human moderation) and also contain more aspirational obligations (eg regarding industry cooperation). This might reflect the seriousness of these particular categories of content and/or the voluntary nature of these Codes, but they are in any event a useful guide to best practice for services preparing for compliance.

Current government guidance

The government has also published several guidance documents regarding online safety as a general matter. These are not linked to the Online Safety Bill but may be indicative of the types of measures services may be expected to take under it. The guidance is clear that the preference is for safety by design – ie services should build features and functionality that promote user safety and prevent the dissemination of illegal and harmful content rather than merely taking down content when aware of it. Measures suggested by the guidance include:

  • requiring users to verify their accounts
  • using age assurance technology to verify user age
  • defaulting to the highest available safety settings (eg content shared by users being visible only to the user's friends, with location turned off)
  • limiting functionality for children or preventing children from departing from default high safety settings
  • making reporting functionality available in different locations (eg in private messaging functionality as well as when viewing public posts) and prompting users to make a report when suspicious activity is detected
  • using automated safety technology to identify illegal and harmful content, supported by human moderation
  • prompting users to review their settings
  • labelling content that has been fact-checked by an official or trustworthy source, and
  • not allowing algorithms to recommend harmful content.

The guidance also includes a specific 'one stop shop' on child online safety for businesses.

Next steps

There are several areas where further legislation or guidance is required for services to more fully understand their obligations concerning illegal and harmful content under the Online Safety Bill. These include regulations to be made by the Secretary of State setting out 'priority' categories of content and, importantly, Ofcom's Codes of Practice, which will operate as a form of 'safe harbour' for compliance.

Services wishing to prepare for and engage with the Online Safety legislation in the meantime can take note of existing government guidance and participate (potentially via an industry group) in Ofcom's consultation (when it is published) regarding the Codes of Practice.

Find out more

To discuss any of the issues raised in this article, please reach out to a member of our Technology, Media & Communications team.

In this series

Technology, media & communications

Online Safety Bill – are you caught?

Briefing

by Louise Popple

Technology, media & communications

Online Safety Bill – illegal and harmful content and safety duties

In-depth analysis

by Xuyang Zhu

Technology, media & communications

Risk assessments under the Online Safety Bill

Quick read

by Mark Owen

Technology, media & communications

The regulation of child safety online – an update

In-depth analysis

by Jo Joyce, Alex Walton
