The online safety discourse in the UK has been dominated by the Online Safety Act over the last few years but this is by no means the only initiative in a crowded space. Building on existing protections, the UK's data protection regulator, the ICO, has focused heavily on children's online safety. Beyond this, measures have been introduced to protect consumers online, to update data protection rules, and to create secure digital identities. So where has the UK got to and what can we expect?
Online Safety Act
The UK's Online Safety Act (OSA) was passed on 26 October 2023 and regulates user-generated online content, with a focus on illegal content and content that is harmful to children. Not all the detail around compliance has been finalised although things are ramping up as Ofcom Codes of Practice and guidance are finalised and safety duties commence.
Over the course of 2024, Ofcom moved towards an operational regime with a multitude of consultations. The culmination of these efforts for many of those in scope came on 16 December 2024, when Ofcom published its Statement on protecting people from illegal harms online. This is a decision on the Illegal Harms Codes and guidance which set out what user-to-user and search service providers need to do to comply with their illegal harms safety duties under the UK's Online Safety Act (read more here). Another significant step came on 16 January 2025, when Ofcom published its Age Assurance and Children's Access Statement with accompanying guidance on highly effective age assurance (HEAA) and Children's Access Assessments.
Secondary and related legislation
Over the course of 2024, secondary legislation was introduced to bring the majority of the OSA into force and, in some cases, to amend or complement it. This includes adding new offences relating to sharing intimate images without consent (revenge porn) and creating deepfakes with sexual content (Ofcom published a discussion paper, Deepfake defences – Mitigating the harms of deceptive deepfakes, on 23 July 2024), as well as dealing with the transition of the regulation of video-sharing providers from the Communications Act to the OSA. On 7 January 2025, the government confirmed it will also introduce further offences around sexually explicit deepfakes and taking intimate images without consent, or installing equipment with intent to commit such offences, albeit not into the OSA directly.
The Online Safety Act 2023 (Commencement No 4) Regulations 2024 were made on 10 December 2024 and bring into force sections of the OSA relating to pornography providers on 17 January 2025.
The draft Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025 were laid before Parliament in December 2024. These are important as they will define the thresholds at which services become categorised and therefore subject to additional obligations under the OSA.
The draft Data (Use and Access) Bill also proposes some amendments to the OSA in relation to research information and information accessible to coroners on the death of a child.
The OSA regime now takes off in earnest for in-scope service providers.
- Illegal harms duties – in-scope providers must assess the risk of illegal harms on their services by 16 March 2025. The Codes of Practice are expected to complete the Parliamentary process in time for the 17 March 2025 deadline at which point the illegal harms safety duties become enforceable by Ofcom. This means that providers need to take the safety measures set out in the Codes or use other effective measures to protect users from illegal content and activity. If services take alternative measures, the onus will be on them to demonstrate these are sufficient to achieve compliance, so, for the vast majority of in-scope services, the Codes of Practice and related regulatory documents and guidance will be the easiest route to compliance. Ofcom is "ready to take enforcement action if providers do not act promptly to address the risk on their services".
- Children's access assessments – all in-scope user-to-user and search services (Part 3 services) must carry out a children's access assessment to determine whether or not their service or part of it is likely to be accessed by children. These must be completed by 16 April 2025. Unless the services are using HEAA, Ofcom anticipates most of these services will have to conclude they are likely to be accessed by children within the meaning of the OSA. They will then need to comply with the children's risk assessment and safety duties.
- Measures to protect children on social media and other user-to-user services – Ofcom will publish its Protection of Children Codes and risk assessment guidance in April 2025. Services likely to be accessed by children will then need to conduct a children's risk assessment within three months (ie by July 2025) and then implement protective measures in line with the Protection of Children Codes.
- Services that allow pornography – Part 5 services which publish their own pornographic content, including some Generative AI tools, must take immediate steps to introduce age checks using HEAA by 17 January 2025. Services that allow user-generated pornographic content (Part 3 services) must have HEAA in place by July 2025.
- Categorised services – the register of categorised services will be published in Summer 2025 with proposals on additional duties for categorised services due no later than early 2026.
ICO
The ICO has led the way in protecting children online with its Children's Code, which took effect in September 2021 and sets standards for online services when using children's data. The ICO has also been focusing on other areas of online safety and this was a priority area in 2024, not least in support of the OSA. During 2024, the ICO updated its age assurance Opinion, published guidance on content moderation and cooperated with Ofcom in relation to the OSA.
In April 2024, the ICO set out its priorities for protecting children online during 2024-25. The ICO's Children's Code strategy will focus on: default privacy and geolocation settings; profiling children for targeted advertisements; using children's information in recommender systems; and using information of children under 13. Calls for views were launched on the latter two elements, and on 29 October 2024 the ICO published a statement on its work to protect children online following research showing that, for many children, data is their only currency: they see giving it to apps and services that help them socialise as a necessary exchange.
The ICO expects to update its Children's Code Strategy in 2025. See here for more on children's data and online harms.
Consumer protection
In 2024, the UK passed the Digital Markets, Competition and Consumers Act 2024 (DMCCA). This reforms the UK's competition and consumer protection regimes. One of the focuses of the DMCCA is to tackle certain harmful online practices which can lead to consumers taking decisions they might not otherwise make. Fake reviews and drip pricing will become automatically unfair (ie blacklisted) practices, and new obligations are introduced to protect consumers from subscription traps. The DMCCA also allows the CMA to take consumer protection enforcement action directly rather than having to apply to the courts, and to impose significant financial penalties. Among other things, this is intended to create a safer online environment for consumers. Most of the consumer protection sections of the DMCCA are expected to commence in April 2025 with the subscription contracts regime coming into effect in Spring 2026.
What further steps can we expect?
Children's exposure to social media continues to be a focus of concern. The UK government recently launched a study into children's use of social media, and there is talk of a ban on under-16s using social media and/or having phones in school.
Another area of focus to facilitate online safety in the broader sense is trusted digital identities. In the UK, the Office for Digital Identities & Attributes recently published an updated pre-release of the UK digital identity and attributes trust framework setting out government-backed rules and standards for trustworthy and secure digital identity services. The Data (Use and Access) Bill, meanwhile, gives powers to the Secretary of State to set up a regime for trusted digital identity verification and to recognise EU and potentially other third country trust services. This is alongside initiatives to give individuals easy access to their own public sector records.
Finally, of course, we need to mention AI. The EU's AI Act has attempted to deal with AI safety-related risks and the UK is likely to consult on draft legislation on AI safety with a focus on frontier AI in Spring this year. At the moment, rumours are swirling as to what it is likely to cover and what it will leave out, but the impact of AI on the online world, whether in terms of profiling, decision-making, copyright infringement, deepfakes, mis- and dis-information, or other risks, may be in scope. In the meantime, Ofcom has also been focusing on risks associated with GenAI, publishing a discussion paper in July 2024 on how red teaming can help mitigate the risk of harms caused by GenAI, and the ICO published an outcomes report setting out its policy positions on data protection in generative AI at the end of last year.
Filling in the gaps
On 20 November 2024, the government published its draft Statement of Strategic Priorities (SSP). Ofcom must have regard to these when carrying out its regulatory functions, including enforcement, and will be required to report back to the Secretary of State on the actions it has taken against the priorities. The reports against the SSP will help inform government action on online safety.
The government has said it will keep the new online safety rules under review, particularly in relation to social media platforms and children. Appearing on the BBC's Laura Kuenssberg show on 12 January 2025, Peter Kyle, Secretary of State for Science, Innovation and Technology, described the Online Safety Act as an inherited landscape "where we have seen a very uneven, unsatisfactory legislative settlement". He stopped short of committing to changing the OSA or publishing further online safety legislation, but said he was open-minded on the subject. One thing he was clear on was that good use would be made of the OSA's enforcement powers.
Creating a safe online environment without stifling either that environment or free speech is an ongoing challenge and a difficult balance to strike. It is made more complex by the fact that different jurisdictions take differing approaches. While the UK's OSA focuses on user-generated content, the EU's Digital Services Act regulates the obligations and accountability of online intermediaries and platforms regarding illegal content, products and services, while promoting transparent advertising. Some of these elements are dealt with in the UK under consumer protection legislation, and the EU looks set to emulate part of the UK's regime in its planned Digital Fairness Act. Whether or not it is possible to make the online environment safe(r) while keeping negative consequences at bay remains to be seen as the incoming and recent regulation takes effect. There is, however, no doubt that online safety is going to continue to be a major concern for the foreseeable future both in the UK and around the world.