On 24 April 2025, Ofcom published its final Children's Safety Codes of Practice under the Online Safety Act 2023 (OSA), together with its statement, guidance on risk assessments and additional documents.
Providers of services likely to be accessed by UK children must complete and record their assessments of the risks their services pose to children by 24 July 2025. Where they determine that risks exist, safety measures to mitigate those risks must be in place and applied from 25 July 2025 (assuming Parliamentary approval of the Codes).
Following the Codes is not mandatory, but services applying the measures in the Codes will be deemed to comply with their OSA obligations relating to children's safety. Services may adopt alternative compliance measures, but they will be required to demonstrate that any alternative measures meet the standards set out in the Codes.
What are the Children's Safety Codes of Practice?
The Codes (which are subject to Parliamentary approval) set out 40 measures to help tech firms meet the required safety standards. These will apply to sites and apps used by UK children across social media, search services and gaming. The measures will help prevent minors from encountering the most harmful types of content relating to suicide, self-harm, eating disorders and pornography, as well as protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.
Measures include:
- Safer feeds – services whose recommender systems pose a medium or high risk of harmful content must configure their algorithms to filter harmful content out of children's feeds.
- Effective age checks – the highest-risk services must use highly effective age assurance to prevent children from accessing harmful material while still allowing adults to access legal content. Services that do not do this must assume that younger children are on their service and ensure those children have an age-appropriate experience.
- Fast action – all sites and apps must have review and takedown processes in place so that they can act quickly on becoming aware of harmful content.
- More choice and support for children – children must be given more control over their online experience, including the ability to indicate what content they don't like, to block and mute accounts, and to disable comments on their own posts. There must also be supportive content for children who may have encountered or searched for harmful content.
- Easier reporting and complaints – services must provide a straightforward way to report content or make complaints, and service providers must respond appropriately. Terms of service must be clear and understandable to children.
Are there other documents to consider?
As has become customary for the Online Safety Act, Ofcom has published a number of relevant documents alongside the Codes, including its statement and guidance on children's risk assessments, with more to come.
What's next?
Ofcom will be updating its resources for services to take account of the new children's duties. It is also consulting on extending some measures in the Illegal Content Codes (eg blocking and muting user accounts and disabling comments) to additional services, as it now considers it proportionate for those measures to apply to certain smaller services likely to be accessed by children. The consultation closes on 22 July 2025, but smaller services should prepare for extended compliance requirements.
What to do now
In-scope user-to-user and search services should have completed their children's access assessments by 16 April 2025. Where children are likely to access the service, the countdown to compliance with duties to protect children online has begun. Children's risk assessments must now be completed by 24 July 2025 and risk mitigation measures operational by 25 July 2025.
Protection of children is an extremely hot topic and the most high-profile issue covered by the OSA. Ofcom will not hesitate to use its enforcement powers against non-compliant services, and enforcement action could be extremely damaging to reputation as well as financially costly.