What's the issue?
The UK's Online Safety Act (OSA) places obligations on providers of online user-to-user and search services (or parts of them) likely to be accessed by children, as part of its broader online safety regime. These duties include conducting risk assessments, protecting children from illegal content and from content likely to be harmful to them, and meeting transparency, reporting and redress requirements.
The duty to “prevent” children from encountering primary priority content harmful to them requires providers to use age verification or estimation (unless their Ts&Cs prohibit such content and that policy applies to all users). Even where not required, age verification and estimation are given as examples of measures that can be used to comply with the child-protection obligations more broadly.
The OSA is regulated by Ofcom, but age verification and estimation will likely involve processing personal data, so the UK's data protection regulator, the ICO, also has an interest (as discussed here). The ICO has played a leading role in protecting children from online harms to date through its Children's Code, a set of overarching principles that information society service providers are required to implement where their services are "likely to be accessed by children" (ie under-18s), in order to embed age-appropriate design wherever personal data is processed in providing those services.
This means that the ICO and Ofcom will have areas of regulatory overlap relating to the OSA (as we discuss here) and, in order to reduce the burden on stakeholders and ensure consistency, they need to work together.
The EU is also focusing on online safety with its Digital Services Act (DSA) covering similar albeit not identical ground to the OSA (see here for a comparison overview of the two).
What's the development?
UK
Under the banner of the Digital Regulation Cooperation Forum (DRCF), Ofcom and the ICO published a Joint Statement on Collaboration on the Regulation of Online Services on 1 May 2024. This sets out:
- Collaboration themes - areas where the regulators will work together.
- Companies of mutual interest - how they will identify when they are looking at the same issues or services.
Collaboration themes highlighted in this statement will be particularly important to businesses that have safety duties under the OSA involving the processing of (children's) personal data, and whose services are also caught by the ICO's Children's Code. The themes will initially cover:
- proactive tech and relevant AI tools
- default settings and geolocation settings for child users
- online safety privacy duties
EU
On 13 May 2024, the European Commission services responsible for enforcing the DSA signed an administrative arrangement with Ofcom. This sets out plans for cooperation on enforcement in areas of common interest, including the protection of children online, age-appropriate design technologies, transparency, risk assessments, and the impact of algorithms on systemic risks for society.
Cooperation is expected to consist of technical expert dialogues, training, studies and coordinated research products, and the sharing of best practice.
What does this mean for you?
The Children's Code and the OSA overlap, even though their definitions and criteria are not identical, and not all organisations in scope of the Children's Code will be caught by the OSA. There are also considerable differences in the obligations each imposes. Service providers will be looking to Ofcom and the ICO for clarity and guidance on how to meet OSA safety duties in a data protection-compliant manner.
Similarly, many businesses come within the scope of both the OSA and the DSA. Those dealing with an unwieldy bundle of obligations will be pleased to see more information about how the EU and UK regulators will cooperate in this area.
See here for more on the OSA, the EU's Digital Services Act and related online safety issues.