This article examines the safety duties under the Online Safety Act (OSA) and types of content to which they relate, focussing on the duties that apply to user-to-user services. It addresses what has changed in relation to safety duties and content covered during the passage of the OSA through Parliament and looks at the final position.
The Online Safety Act (OSA) aims to protect online users from illegal content and some types of harm caused by user-generated content. It applies to user-to-user and search service providers. It also covers pornographic content services which are beyond the scope of this article. For more on who is in scope, see our article here.
Providers of in-scope services are (among other things) required to comply with safety duties in relation to the different types of user-generated content covered by the OSA. Broadly, the safety duties require services to assess the risks posed by such content and to take proportionate steps to mitigate them.
The OSA does not prescribe detailed steps that services must take to protect users. Instead, the focus is on each service taking proportionate measures and putting in place proportionate systems and processes to address the risks posed by the service. Ofcom will recommend more detailed steps services can take to achieve compliance in its codes of practice, which are to be published for consultation shortly, with initial consultations due to start on 9 November 2023.
At its inception the Online Safety Bill (OSB) imposed obligations on in-scope services in relation to three types of content – illegal content, content that is harmful to children, and content that is harmful to adults. It applied further sub-categorisations within these content types, the precise details of which were largely left to be designated in secondary legislation. The safety duties varied depending on the content in question.
During the legislative process the Bill underwent the following significant changes in relation to safety duties and categorisation of content:
Category 1 services' duties to protect adults are now limited to adult user empowerment duties
The original OSB required Category 1 services to set out in their terms and conditions how the service would treat each kind of "priority content that is harmful to adults" (which would be defined in secondary legislation), apply the T&Cs consistently, and notify Ofcom of any other (non-priority) content that is harmful to adults present on the service. These safety duties applied in addition to duties to empower adult users to increase their control over the content they might see and other users they would interact with. These provisions were largely removed during the legislative process. The user empowerment duties that remain in the OSA (as described below) take a similar form to those initially proposed in the OSB, but they apply only to particular types of content that are now specified in the OSA itself (rather than being left to secondary legislation).
Primary priority and priority content that is harmful to children are now defined in the OSA
Whereas the OSB originally envisaged that these key concepts would be defined in secondary legislation, the OSA now expressly sets out the types of content falling within these categories (as described below).
There is now a stronger requirement for services likely to be accessed by children to use age verification and age estimation technologies
The original OSB referred to services using age verification or other age assurance as an example of how they might comply with duties to protect children. The OSA now, in addition to this indicative statement, mandates the use of age verification or age estimation (or both) to prevent children from encountering primary priority content that is harmful to children, unless the service's T&Cs prohibit the relevant kind of primary priority content for all service users regardless of age. Further, a provider is only entitled to conclude that children are not able to access the service (or part of it) if age verification or age estimation (or both) is used with the result that children are not normally able to access that service or part. Where its use is required, the age verification or age estimation must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child.
It is important to understand the various types of content covered by the OSA as different safety duties attach to different kinds of illegal and harmful content. While the types of content to which safety duties will apply are now set out at a high level in the OSA itself (rather than being left to secondary legislation), Ofcom will publish more detailed guidance as to examples of the content it considers to fall within some of these categories. We provide a summary table below.
Different safety duties apply depending on the category of content in question. The main safety duties are summarised in the table below. The OSA expressly provides that relevant factors in determining proportionality in relation to the safety duties include the findings of the most recent risk assessments that services must undertake under the OSA as well as the size and capacity of the service provider. For more information regarding risk assessments, see our article here. For more information on the OSA's obligations in relation to children, see our article here.
The safety duties are supported by obligations for services to operate using systems and processes that allow users and other affected persons to report illegal content and content harmful to children, operate accessible complaints procedures, keep records of measures taken to comply with duties, and review compliance regularly and after making significant changes to any aspect of the service's design or operation.
In addition to these safety duties, all services have an obligation to report CSEA content to the National Crime Agency. Category 1 and Category 2A service providers also have duties to use proportionate systems and processes to mitigate the risks of fraudulent advertising on the service.
More detailed recommended steps for compliance will be set out by Ofcom in its codes of practice. Adhering to them will not be the only way to comply, but services that follow them will be treated as having complied with the relevant duties under the OSA. If service providers take alternative measures, the onus will be on them to demonstrate that those measures are sufficient. For example, as part of their record-keeping obligations, they will need to record which measures in the relevant code of practice they have not used, the alternative measures they have used, and how those alternative measures comply with the duty in question while also having regard to users' freedom of expression and privacy.
In drafting the codes of practice, Ofcom will need to consult representatives of regulated service providers, users, experts and interest groups, among others, so affected stakeholders will have an opportunity to make their views known. We examine Ofcom's preparations for regulation and updated roadmap here. We recommend that all affected services consider participating in relevant consultations and other public policy engagement regarding the codes of practice before these important documents are finalised.