What's the issue?
The UK's Online Safety Act (OSA) was passed on 26 October 2023. Much of the detail around compliance will only become fully clear once Ofcom guidance and codes of practice are finalised.
Under the OSA, additional obligations apply to Category 1, 2A and 2B services, which are defined as follows:
- Category 1 and 2B – user-to-user services that meet the respective threshold conditions to be set out in secondary legislation relating to the number of users, service functionality and/or other factors.
- Category 2A – search services that meet threshold conditions, also to be set out in secondary legislation, relating to the number of users and/or other factors.
Services in scope of the OSA must therefore wait not only for the relevant codes of practice and guidance to understand the nature of their duties and how to comply with them, but also for the secondary legislation that will set the threshold conditions and so determine whether they are classified as a categorised service and subject to additional requirements.
What's the development?
Ofcom launched phase three of its online safety regulation plan on 25 March 2024. This covers additional duties for categorised services including transparency reporting, user empowerment, fraudulent advertising and user rights.
Phase three will follow a three-step process:
- First, the service providers subject to additional duties will be identified.
- Second, Ofcom will consult in summer 2024 on guidance relating to the transparency reporting regime, so that its implementation can be prioritised in 2025.
- Third, in early 2025, Ofcom will consult on draft codes and guidance detailing how categorised services can comply with the remaining additional duties. This consultation will be informed by a call for evidence published on 25 March.
Phase three began with a call for evidence to inform the planned early 2025 consultation. The call seeks input from stakeholders on:
- additional terms of service duties
- news publisher content, journalistic content and content of democratic importance and related duties
- user identity verification duties
The call closes on 20 May 2024.
Ofcom has also published its advice to the government on the thresholds which will determine categorisation as a Category 1, 2A or 2B service, as follows:
Category 1 – should apply to services which meet either of the following conditions:
- Condition 1 – uses a content recommender system; and has more than 34m UK users on the user-to-user part of the service, representing around 50% of the UK population; or
- Condition 2 – allows users to forward or reshare user-generated content; uses a content recommender system; and has more than 7m UK users on the user-to-user part of the service, representing around 10% of the UK population.
Category 2A – should apply to services which meet both of the following criteria:
- is a search service but not a 'vertical' search service; and
- has more than 7m UK users, representing around 10% of the UK population.
Category 2B – should apply to services which meet both of the following criteria:
- allows users to send direct messages; and
- has more than 3m UK users on the user-to-user part of the service, representing around 5% of the UK population.
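Purely as an illustration, the thresholds above can be read as a simple decision rule. The sketch below (in Python, with hypothetical field names, and assuming Ofcom's advice is adopted unchanged in the secondary legislation) is not a compliance tool, just a restatement of the proposed logic:

```python
# Illustrative only: encodes Ofcom's March 2024 threshold advice as a decision
# rule, assuming it is adopted unchanged. All field names are hypothetical.

def categorise(service: dict) -> set[str]:
    """Return the categories a service would fall into under Ofcom's advice."""
    categories = set()

    # Category 1: meeting either condition is sufficient.
    cond1 = (service.get("content_recommender")
             and service.get("u2u_uk_users", 0) > 34_000_000)
    cond2 = (service.get("reshare_ugc")
             and service.get("content_recommender")
             and service.get("u2u_uk_users", 0) > 7_000_000)
    if cond1 or cond2:
        categories.add("Category 1")

    # Category 2A: both criteria required.
    if (service.get("is_search") and not service.get("is_vertical_search")
            and service.get("uk_users", 0) > 7_000_000):
        categories.add("Category 2A")

    # Category 2B: both criteria required.
    if service.get("direct_messaging") and service.get("u2u_uk_users", 0) > 3_000_000:
        categories.add("Category 2B")

    return categories

# Example: a large user-to-user service with a recommender feed and DMs
# falls into both Category 1 and Category 2B (set print order may vary).
print(categorise({
    "content_recommender": True,
    "direct_messaging": True,
    "u2u_uk_users": 40_000_000,
}))
```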
What does this mean for you?
The Secretary of State must now set out threshold conditions in secondary legislation, taking Ofcom's advice into account. Once the legislation is passed, Ofcom will gather information as needed from regulated services and produce a published register of categorised services.
It is not certain that Ofcom's threshold recommendations will be accepted without amendment, but they are helpful indicators for organisations trying to understand where they are likely to fit within the online safety regime.
Other OSA news
On 27 March 2024, Ofcom published information on its powers under s101 OSA to gather information relating to the death of a child. The section came into force on 1 April 2024. It gives Ofcom powers to request information from the services set out in s100(5)(a)-(e) of the OSA to support a coroner's or procurator fiscal's investigation into the death of a child. Ofcom can request:
- Content encountered by the child.
- How that content came to be encountered by the child, including the role of algorithms or functionalities.
- How the child interacted with the content.
Are we about to see more online safety regulation in the UK?
News that WhatsApp has changed its terms and conditions to allow children aged 13-15 to use its service has prompted renewed media debate about children's access to social media, messaging services and even smartphones.
Reports suggest that the government plans to try to persuade some of the biggest social media businesses to voluntarily alert parents when their children access unsuitable content, and that it may also be considering mandating greater parental controls or even banning under-16s from social media.
The government may be watching recent developments in Florida, whose ban on social media for children under 14 is scheduled to come into effect on 1 July 2024. The new Bill bans under-14s from social media and requires minors aged 14 and 15 to obtain explicit parental consent to create a user account. Existing accounts which become unlawful on 1 July will need to be closed, and the associated data deleted by the relevant companies, unless retention is required by law.
The Florida Bill also states that companies meeting set criteria which intentionally publish or distribute content harmful to minors must use age verification to block access to that content by under-18s. The legal framework is, of course, very different in the USA, and the Bill (an outlier in the US) is expected to face legal challenges on the basis that it breaches the First Amendment right to free speech. If it survives those challenges, it will be instructive to see whether it proves enforceable in practice.
If the UK government were to proceed with restricting children's access to social media, there would be an additional set of online safety considerations, not to mention data protection considerations, for in-scope businesses, and, no doubt, still more for regulators like Ofcom and the ICO to do in this area. The ICO has set out its priorities for protecting children online during 2024-25 which, unsurprisingly, include working closely with Ofcom in its capacity as regulator of the Online Safety Act. It seems unlikely, however, that the OSA will fully resolve the online safety debate.