
2 November 2023

The UK's Online Safety Act – 3 of 9 Insights

The UK's Online Safety Act – safety duties

Xuyang Zhu and Danielle Owusu give an overview of safety duties in relation to the different types of illegal and harmful content covered by the OSA.

Authors

Xuyang Zhu

Senior Counsel


Danielle Owusu

Trainee Solicitor


This article examines the safety duties under the Online Safety Act (OSA) and the types of content to which they relate, focusing on the duties that apply to user-to-user services. It addresses what changed in relation to safety duties and the content covered during the passage of the OSA through Parliament, and looks at the final position.

The Online Safety Act (OSA) aims to protect online users from illegal content and some types of harm caused by user-generated content. It applies to user-to-user and search service providers. It also covers pornographic content services, which are beyond the scope of this article. For more on who is in scope, see our article here.

What are safety duties?

Providers of in-scope services are (among other things) required to comply with safety duties in relation to different types of user-generated content covered by the OSA.  The safety duties broadly comprise:

  • duties in relation to illegal content, which apply to all in-scope services
  • duties protecting children, which apply to services likely to be accessed by children, and
  • adult user empowerment duties, which apply only to Category 1 services.

The OSA does not prescribe detailed steps that services must take to protect users. Instead, the focus is on each service taking proportionate measures and putting in place proportionate systems and processes to address the risks posed by the service. Ofcom will recommend more detailed steps services can take to achieve compliance in its codes of practice, which are to be published for consultation shortly; the initial consultations are due to start on 9 November 2023.

What has changed during the passage of the Act?

At its inception the Online Safety Bill (OSB) imposed obligations on in-scope services in relation to three types of content – illegal content, content that is harmful to children, and content that is harmful to adults.  It applied further sub-categorisations within these content types, the precise details of which were largely left to be designated in secondary legislation.  The safety duties varied depending on the content in question.

During the legislative process the Bill underwent the following significant changes in relation to safety duties and categorisation of content:

Category 1 services' duties to protect adults are now limited to adult user empowerment duties 

The original OSB required Category 1 services to set out in their terms and conditions how the service would treat each kind of "priority content that is harmful to adults" (which would be defined in secondary legislation), apply the T&Cs consistently, and notify Ofcom of any other (non-priority) content that is harmful to adults present on the service.  These safety duties applied in addition to duties to empower adult users to increase their control over the content they might see and other users they would interact with.  These provisions were largely removed during the legislative process.  The user empowerment duties that remain in the OSA (as described below) take a similar form to those initially proposed in the OSB, but they apply only to particular types of content that are now specified in the OSA itself (rather than being left to secondary legislation).

Primary priority and priority content that is harmful to children are now defined in the OSA 

Whereas the OSB originally envisaged that these key concepts would be defined in secondary legislation, the OSA now expressly sets out the types of content falling within these categories (as described below).

There is now a stronger requirement for services likely to be accessed by children to use age verification and age estimation technologies 

The original OSB referred to services using age verification or other age assurance as an example of how they might comply with duties to protect children. The OSA now, in addition to this indicative statement, mandates the use of age verification or age estimation (or both) to prevent children from encountering primary priority content that is harmful to children, unless the service's T&Cs prohibit the relevant kind of primary priority content for all service users regardless of age. Further, a provider is only entitled to conclude that children are not able to access the service (or part of it) if age verification or age estimation (or both) is used with the result that children are not normally able to access that service or part. Where age verification or age estimation is required to be used, it must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child.

Illegal and harmful content

It is important to understand the various types of content covered by the OSA as different safety duties attach to different kinds of illegal and harmful content. While the types of content to which safety duties will apply are now set out at a high level in the OSA itself (rather than being left to secondary legislation), Ofcom will publish more detailed guidance as to examples of the content it considers to fall within some of these categories. We provide a summary table below.

View table

Safety duties

Different safety duties apply depending on the category of content in question.  The main safety duties are summarised in the table below.  The OSA expressly provides that relevant factors in determining proportionality in relation to the safety duties include the findings of the most recent risk assessments that services must undertake under the OSA as well as the size and capacity of the service provider.  For more information regarding risk assessments, see our article here.  For more information on the OSA's obligations in relation to children, see our article here.

The safety duties are supported by obligations for services to operate using systems and processes that allow users and other affected persons to report illegal content and content harmful to children, operate accessible complaints procedures, keep records of measures taken to comply with duties, and review compliance regularly and after making significant changes to any aspect of the service's design or operation.   

In addition to these safety duties, all services have an obligation to report CSEA content to the National Crime Agency.  Category 1 and Category 2A service providers also have duties to use proportionate systems and processes to mitigate the risks of fraudulent advertising on the service.

View table

Next steps for compliance

The OSA provides that safety duties in relation to illegal content and protecting children apply across all areas of a service, including the way it is designed, operated and used as well as content present on it.  Services will need to take measures (if proportionate) in areas including regulatory compliance and risk management arrangements, the design of functionalities, algorithms and other features, policies on terms of use and user access, content moderation and take-down, functionalities allowing users to control the content they encounter, user support measures, and staff policies and practices, among others. 

More detailed recommended steps for compliance will be set out by Ofcom in its codes of practice. Adhering to them will not be the only way to comply, but services will be treated as having complied with particular duties under the OSA if they do. If service providers take alternative measures, the onus will be on them to demonstrate that those measures are sufficient. For example, as part of their record-keeping obligations, they will need to record which measures in the relevant code of practice they have not used, the alternative measures they have used, and how the alternative measures comply with the duty in question while also having regard to users' freedom of expression and privacy.

In drafting the codes of practice, Ofcom will need to consult representatives of regulated service providers, users, experts and interest groups, among others, so there will be an opportunity for affected stakeholders to make their views known. We examine Ofcom's preparations for regulation and updated roadmap here. We recommend that all affected services consider participating in relevant consultations and other public policy engagement regarding the codes of practice before these important documents are finalised.
