Ofcom has had another busy month, producing a range of consultations and reports as it pushes on with implementation of the Online Safety Act.
However, the Act is already under considerable scrutiny and continues to be criticised for not doing enough to tackle harmful but legal online content. Here we look at what's happened over the last few weeks and at what to expect.
Online age check requirements now in force
On 25 July 2025, services which allow pornography became subject to a requirement to use highly effective age assurance to prevent children from accessing it. Similarly, sites allowing Primary Priority Content (harmful content relating to self-harm, suicide, eating disorders or extreme violence/gore) must now use highly effective age checks.
Statement of Strategic Priorities for Online Safety designated
Ofcom's Final Statement of Strategic Priorities for Online Safety was designated on 2 July 2025, having been laid in draft before Parliament on 8 May 2025. The SSP sets out the government's focus areas for online safety. Ofcom is required to have regard to them as it implements the Online Safety Act.
As before, the government's priorities are:
- transparency and accountability
- inclusivity and resilience
- technology and innovation – fostering innovation in online safety technologies.
On 25 July 2025, Ofcom published its response to the SSP, setting out actions to date and summarising relevant work to be carried out over the coming year. Unsurprisingly, this focuses on protecting children and the vulnerable and ensuring that enforcement of the OSA is both proportionate and effective.
Fees and penalties
On 26 June 2025, Ofcom published its policy statement on the implementation of the online safety fees and penalty regime following consultation. This sets out Ofcom's final decisions and the reasoning behind them. Ofcom proposes:
- defining qualifying worldwide revenue (QWR) as a firm's global revenue from the relevant parts of its regulated services, rather than all revenue attributable to the UK, for the purpose of calculating fees and maximum penalties
- setting fees at approximately 0.02% to 0.03% of QWR each year
- setting the QWR threshold at which companies will be required to pay fees at £250 million per year
- exempting any company meeting that threshold where its UK-referable revenue is less than £10 million per year.
Both the fee threshold and the possible exemption threshold will ultimately be decided by the Secretary of State, but Ofcom considers that any threshold of between £200 million and £500 million would be appropriate.
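The proposed liability test can be sketched as a short calculation. This is an illustrative reading of the figures above only: the function and parameter names are our own, the final thresholds rest with the Secretary of State, and the precise fee rate within the 0.02%–0.03% band has not been fixed.

```python
# Illustrative sketch of Ofcom's proposed fee liability test: a £250m QWR
# threshold, a £10m UK-referable revenue exemption, and annual fees of
# roughly 0.02%-0.03% of QWR. All names and defaults are assumptions drawn
# from the policy statement figures, not an official formula.

def fee_estimate(qwr: float, uk_revenue: float,
                 qwr_threshold: float = 250e6,
                 uk_exemption: float = 10e6) -> tuple[float, float]:
    """Return the (low, high) annual fee range in pounds, or (0, 0) if no fee is due."""
    if qwr < qwr_threshold or uk_revenue < uk_exemption:
        return (0.0, 0.0)  # below the QWR threshold, or exempt on UK-referable revenue
    return (qwr * 0.0002, qwr * 0.0003)  # approx. 0.02%-0.03% of QWR

# On these assumptions, a firm with £1bn QWR and £50m UK-referable revenue
# would pay roughly £200,000-£300,000 a year.
print(fee_estimate(1e9, 50e6))
```

Note that both limbs must be met: a firm above the £250 million QWR threshold still pays nothing if its UK-referable revenue falls below the £10 million exemption level.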
Three statutory instruments are required to give effect to Ofcom's statement, two of which (on notification and QWR) were submitted to Parliament on 26 June. On 18 July 2025, Ofcom published a consultation on guidance to help providers calculate their qualifying worldwide revenue under The Online Safety Act 2023 (Qualifying Worldwide Revenue) Regulations 2025. Responses are requested by 17:00 on 10 September 2025.
Later this year, Ofcom will consult on a Statement of Charging Principles ahead of the first due payments in financial year 2026-27.
Enforcement
Ofcom has been actively investigating porn providers and their preparations for compliance with the highly effective age assurance provisions which came into force on 25 July 2025. On 26 June 2025, Ofcom announced that, following its initial investigations, the UK's major porn providers had agreed to bring in highly effective age checks by the deadline. Online firms which publish their own pornography have been required to protect children from it since November 2024. Ofcom underlines that enforcement in this area will be one of its priorities.
On 9 July 2025, Ofcom launched an enforcement programme to monitor industry compliance with children's risk assessment duties under the Act.
Ofcom also announced a monitoring and impact programme focused on the biggest platforms where children spend most time. A comprehensive review of these platforms' efforts to assess risks to children must be submitted to Ofcom by 7 August 2025. Details of practical actions to keep children safe must be disclosed to Ofcom by 30 September. Ofcom will track children's online experiences and will take enforcement action if evidence suggests platforms are failing to comply with their children's safety obligations.
On 24 July 2025, Ofcom announced a new age assurance enforcement programme building on work undertaken by its 'small but risky' taskforce. This will specifically target sites dedicated to the dissemination of Primary Priority Content. The programme is expected to run for at least four months during which time Ofcom may decide to open separate compliance investigations.
Separately, Ofcom closed its investigations into Kick Entertainment without reaching any conclusions.
Transparency reporting regime
On 21 July 2025, Ofcom published its Statement on Online Safety Transparency Reporting and Transparency Reporting Guidance. Under the OSA, categorised services will have additional duties including to publish an annual transparency report in response to an Ofcom-issued transparency notice. Separately, Ofcom must publish its own transparency report identifying insights and conclusions drawn from the transparency reports provided by the categorised services. Ofcom's statement sets out how it will determine what transparency reports should cover and how it will produce its own transparency reports, as well as its approach to ensuring compliance with transparency duties.
Reports
Ofcom report on researchers' access to information about online safety
On 8 July 2025, Ofcom published a report on independent researchers' access to information about online safety matters held by providers of regulated online services under the OSA. The report looks at how and to what extent such access should be given and at current constraints on information sharing. It proposes three potential policy options to improve access: clarifying existing rules, creating new duties enforced by a backstop regulator, and enabling and managing access via an intermediary. Ofcom proposes a layered, flexible approach.
Discussion paper on deepfakes
On 11 July 2025, Ofcom published a second discussion paper on Deepfake Defences which looks more closely at the merits of 'attribution measures' including watermarking, provenance metadata schemes, AI labels and context annotations. The paper discusses the pros and cons of the measures and looks at how to deploy them successfully.
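The attribution measures the paper discusses can be pictured with a toy example: wrapping a piece of generated content with machine-readable provenance metadata and a visible label. The field names below are illustrative inventions, not drawn from any standard; real provenance schemes (such as C2PA's Content Credentials) define their own signed manifest formats.

```python
# Toy illustration of 'attribution measures': provenance metadata plus a
# visible AI label attached to generated content. All field and function
# names here are assumptions for illustration, not a real scheme.

import json
from datetime import datetime, timezone

def label_ai_content(content: str, generator: str) -> dict:
    """Wrap content with provenance metadata and a user-facing context annotation."""
    return {
        "content": content,
        "visible_label": "AI-generated",   # context annotation shown to users
        "provenance": {                    # machine-readable metadata
            "generator": generator,
            "created": datetime.now(timezone.utc).isoformat(),
        },
    }

record = label_ai_content("A synthetic news image", generator="example-model")
print(json.dumps(record, indent=2))
```

As the paper notes, metadata of this kind is only as robust as the systems that preserve it: labels stripped on re-upload, or watermarks removed by editing, limit what attribution alone can achieve.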
Video-sharing platform rules
On 25 July 2025, the new video-sharing platform regime under s210 of the OSA came into force and Part 4B of the Communications Act 2003 was repealed in accordance with the Online Safety Act 2023 (Commencement No 6) Regulations 2025. From that date, services which previously fell under Part 4B CA are wholly regulated under the OSA, signifying the end of the transition regime.
More rules to come?
Back in the mists of time, the Online Safety Act started life as the Online Harms Bill. In its initial iteration, it aimed to tackle harmful but lawful online content. These provisions were watered down when the Bill was republished as the Online Safety Bill, largely because of the difficulties of determining what constituted harmful online content, and potential conflicts with free speech.
Many continue to criticise the OSA for failing sufficiently to deal with this kind of online harm, and Peter Kyle, Secretary of State for Science, Innovation and Technology, has been treading a politically tricky line between complaining that the OSA is an inherited mess (or words to that effect) and insisting that it is fit for purpose. He has, however, suggested additional rules may be appropriate in future.
Ofcom, unsurprisingly, continues to focus on illegal content and content harmful to children within the remit of the OSA, but on 30 June 2025, it launched a consultation on additional steps tech firms should take to protect people in the UK online. Proposals include stopping illegal content going viral, protecting children when livestreaming, and tackling intimate images shared without consent. The new measures build on the illegal harms and children's codes of practice and push platforms to go further. Ofcom proposes:
- Providers incorporate safety by design and prevent illegal material reaching users, including by using hash matching to detect terrorism content and intimate images shared without consent, eg explicit deepfakes. Ofcom also suggests some services should assess the role automated tools can play in detecting content and use them where they are available and effective.
- Sites and apps should prevent people from posting comments or reactions or sending gifts to children's livestreams and prevent people from recording children's livestreams. Providers should already be taking steps to protect children from grooming and be using robust age checks. Companies are expected to ban users who share child exploitation and abuse material.
The consultation on the new proposals is open until 20 October 2025.
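Hash matching, as referred to in the proposals above, can be illustrated in miniature: known illegal images are reduced to hashes held in a blocklist, and uploads are hashed and compared before they reach other users. The sketch below uses plain SHA-256, which only catches exact byte-for-byte copies; deployed systems instead use perceptual hashes (such as PhotoDNA) drawn from curated industry lists, and the names here are purely illustrative.

```python
# Minimal illustration of hash matching. Plain SHA-256 is used for
# simplicity; it only detects exact copies, whereas real deployments use
# perceptual hashing to catch resized or re-encoded variants. The blocklist
# contents and function names are invented for this example.

import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the hex SHA-256 digest of the given bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist of hashes of known prohibited images.
blocklist = {sha256_hex(b"known-prohibited-image-bytes")}

def should_block(upload: bytes) -> bool:
    """Return True if the upload's hash appears on the blocklist."""
    return sha256_hex(upload) in blocklist

print(should_block(b"known-prohibited-image-bytes"))  # exact copy matches
print(should_block(b"some-innocuous-image-bytes"))    # no match
```

The design point is that platforms never need to hold the illegal images themselves, only their hashes, which is what makes shared industry hash lists workable.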
On 11 July 2025, the UK Science, Innovation and Technology Committee published a report on its inquiry into social media, misinformation and harmful algorithms, commissioned in the wake of the Southport riots. The report suggests the current online safety regime does not go far enough in tackling the spread of misinformation, not least because it was not intended to address algorithmic amplification of this kind of content. It makes a number of recommendations including that:
- Social media platforms should be required to embed tools that identify and deprioritise misleading content likely to cause significant harm
- Platforms should have duties to undertake risk assessments and reporting requirements relating to harmful but legal content
- All online services with content recommender systems should give users the right to reset the data stored by their algorithm
- AI-generated content should automatically be labelled as such with metadata and visible, permanent watermarks
- Ofcom should be given powers to fine platforms which allow monetisation of harmful content on their services.
The government is not required to implement any of these recommendations, but they do add fuel to calls to revisit the OSA. Any changes are, however, unlikely to happen soon, and both Ofcom and the government have urged people to wait until the OSA regime is fully up and running before assessing how effective it is.