The protection of minors online has become a key regulatory priority across Europe, with the EU Digital Services Act (DSA) establishing a framework complemented by distinct national legislation in Germany, France, Italy and Spain. In addition, the UK has developed a comprehensive regulatory framework, not least with its flagship Online Safety Act. Are these frameworks doing what legislators intended and what more can we expect in terms of additional regulation and enforcement in 2026?
DSA headlines will be dominated by enforcement action
During 2025, we saw the European Commission begin to ramp up DSA enforcement. This included opening formal proceedings against porn providers over their age verification practices and failures to protect minors from pornographic content; preliminary findings of breaches of the Article 34 risk assessment and Article 35 risk mitigation requirements; and preliminary findings that Meta's Facebook and Instagram failed to provide users with simple mechanisms to notify illegal content and to challenge content moderation decisions effectively.
Towards the end of the year, the Commission issued a range of requests for information targeted at big online platforms as part of its early enforcement activity. In September 2025, it sent requests for information (including to Apple regarding the App Store and Microsoft regarding Bing) asking how they identify and mitigate risks relating to financial scams under the DSA. In October, it sent further requests for information to social media companies and other providers (including Snapchat and Apple) regarding the steps they take to safeguard minors.
We expect not only the conclusion of ongoing investigations during 2026, but also an expansion of regulators' enforcement priorities, most likely concentrating on age assurance and age verification. On 14 July 2025, the European Commission published guidelines on the protection of minors under the DSA. They apply to all online platforms accessible to minors, with the exception of small and micro enterprises (see more). As we discuss here, the Commission's focus on enforcement of child protection provisions will reportedly extend beyond big tech to smaller online platforms, with the European Board for Digital Services coordinating with competent authorities. This makes conducting risk assessments, embedding safety by design and maintaining clear documentation important for all businesses in scope of the Article 28 requirements.
Regarding age assurance and age verification, in 2025 the EU Commission released a prototype of an EU-wide age verification app, which allows users to prove they are over 18 when accessing restricted adult content. It is currently being piloted by Denmark, France, Greece, Italy and Spain, each of which is expected to customise it into a national age verification app, and we expect to see further take-up in 2026.
VLOPs and VLOSEs may get new regulators
The Digital Omnibus proposal, published on 19 November 2025, proposed moving oversight of AI embedded in DSA Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) to the Commission's AI Office. The AI Office would be granted exclusive supervisory power over AI systems embedded in, or themselves qualifying as, VLOPs and VLOSEs.
The Omnibus proposal seeks to centralise supervision of VLOPs and VLOSEs, giving them a predictable 'one-stop shop' (the AI Office) for their most critical cross-border AI products. However, it would also mean VLOPs and VLOSEs adding one more regulator into the mix where they have embedded AI systems. It is possible that the Omnibus proposal, which is currently at the very first stage of the legislative process, will not be enacted in 2026. However, given that the AI proposal is separate from the potentially more controversial data proposal, and that it needs to be enacted within six months for the postponed implementation dates of parts of the AI Act to apply, we may see at least the AI part go through before August 2026. Read more about the Digital Omnibus.
The EU Digital Fairness Act will come to the party
The long-awaited Digital Fairness Act proposal failed to materialise in 2025 and is now scheduled for Q4 2026, as we discuss here. If that remains the position, it will not be passed in 2026, but it is likely to focus discussion on consumer protection issues adjacent to online safety, including deepfakes, unfair contract terms and influencer marketing. The Digital Omnibus proposes a one-year compliance postponement of Article 50(2) AI Act, which is currently set to introduce watermarking requirements for generative AI systems from 2 August 2026. As discussed above, the proposal may not be passed in time for a deferment of the current deadlines, but deepfakes are likely to be a focal point of online safety reform in the EU during 2026 either way.
EU Member States will keep trying to develop their own regimes
Germany
Germany has established a regulatory regime through a regularly changing 'split system' combining federal and state-level protections. Its Federal Youth Protection Act (JuSchG), which was originally only applicable to offline content, has applied in part to so-called telemedia since reforms in 2021. Among other things, the reformed JuSchG now requires providers of film and game platforms to implement age ratings and requires "online platforms" to take so-called "precautionary measures", including by using child-friendly terms and conditions, child-friendly default settings, notice-and-takedown mechanisms for user-generated content, rating systems for audiovisual content, and age verification.
The Interstate Treaty on the Protection of Minors in the Media (JMStV) is a state treaty applicable to so-called telemedia, i.e. online services. Reforms to the JMStV introducing new rules for operating system (OS) and app providers became effective from 1 December 2025. Based on these new rules, OS providers typically used by minors must ensure that their OS includes parental control mechanisms. The impact on other market players may include downstream effects on third-party app stores, age rating requirements for app providers, further rating obligations, and potential de facto censorship concerns.
The practical impact of these new rules varies. While the effective date is 1 December 2025, the regulator will first have to designate in-scope OS providers, and provisions will only apply one year after designation. Further delay is possible through administrative proceedings and litigation.
In a recent ruling, an administrative court refused blocking orders requested by the media regulator against pornographic website providers based in Cyprus. Notably, the court held that, in light of recent case law of the Court of Justice of the European Union (CJEU), the provisions of the JMStV on which the blocking orders were based violate the primacy of EU law. This contradicts earlier German court rulings from 2021-2023 on similar subject matter, which did grant blocking orders. So, while the new rules now apply, it will be interesting to see whether they ultimately survive. Regulators may test them quickly to see whether the courts will refer questions to the CJEU, and that could prove decisive for their durability. Read more about Germany's online safety regime here.
France
France's Act No. 2024-449 on age verification (also known as the SREN law – sécurité et régulation de l'espace numérique) aims to prevent minors from accessing online pornographic content by imposing robust age verification obligations on website operators and relevant service providers. Operators must implement technical measures to verify that users are at least 18 years old before granting access to pornographic material. In October 2024, the French regulator (ARCOM) rolled out standards for age verification and, during 2025, several major platforms chose to voluntarily block or suspend access from France rather than implement these stringent age verification requirements.
In August 2025, ARCOM escalated its enforcement approach and issued formal notices to five EU-based porn sites – similar to the EU Commission's approach against online platforms. However, the regime has been subject to successive legal challenges. So far, challenges to the ARCOM technical framework have been unsuccessful, but challenges to the law itself are ongoing. Notably, in September 2025, Advocate General Szpunar delivered an Opinion concluding that the SREN law's obligation on online service providers to implement age verification is incompatible with EU law and the country-of-origin principle of the e-Commerce Directive. The CJEU is expected to deliver its ruling in early 2026, and the decision stands to have a significant impact not only on the SREN law but also on similar national age verification regimes. Read more about the French online safety regime here.
Italy
Italy has substantially accelerated its efforts to protect minors online through resolutions adopted by its Communications Authority (AGCOM). Similar to developments in France, Italy now requires age verification for users seeking access to websites and video-sharing platforms offering pornographic content. AGCOM defines the requirements for such age verification systems in Resolution No. 96/25/CONS. The system is based on a two-step process: identification, followed by authentication of the identified person. A focus for 2026 is likely to be the ongoing debate in Italy on whether to raise the minimum age for social media access. Read more about Italy's online safety regime here.
Spain
In Spain, the protection of minors online has advanced through updates to the General Law on Audiovisual Communication (LGCA), which now sets clear obligations for audiovisual platforms to prevent minors' exposure to harmful content. Building on this, the Draft Organic Law for the protection of minors in digital environments was approved in March 2025 by the Council of Ministers and is now in its final stage of approval. This law will mandate comprehensive age verification measures for accessing sensitive or potentially dangerous online services. It also imposes obligations on manufacturers of digital products with operating systems to provide information on their products about the possible risks for minors. Read more about Spain's online safety regime here.
The UK's Online Safety Act – ongoing implementation with some delays
Ofcom, the regulator under the UK's Online Safety Act (OSA), got off to a flying start with enforcing the regime during 2025, although we are yet to see any major fines for non-compliance. The initial focus has been on failures to respond to information requests and on the use of age assurance, particularly by Part 5 services. Children's online safety is very much at the forefront of Ofcom's activities, with an active investigation into an online suicide forum, an enforcement programme to protect users from image-based CSAM, and the creation of an age assurance enforcement programme, among other initiatives. In September, Ofcom confirmed its decision to fine 4Chan £20,000 plus daily penalties for failure to comply with information requests. On 20 November 2025, Ofcom issued its second fine under the OSA (£50,000 against a nudification site for failing to use age checks to protect children from online pornography), and announced two provisional sanctions and investigations into a further 20 online porn sites.
Ofcom CEO Melanie Dawes said in October that large platforms popular with children will be an enforcement focus for Ofcom going forward (see more here and here). We expect to see major investigations, and potentially significant fines, relating to failures to protect children online in 2026. However, it is worth noting that 4Chan and others are contesting the UK's jurisdiction to impose fines in the US courts, as we discuss here. The outcome could have major implications for OSA enforcement against some (although not all) US companies.
On 12 November 2025, Ofcom published an update on its OSA implementation plans. Of particular note is the delay to categorisation and to the additional duties on categorised services. Following the (unsuccessful) judicial review of the categorisation thresholds brought by Wikimedia, Ofcom has reconsidered its approach. It will now carry out further consultation with prospective categorised services in early 2026, and publish the categorisation register and a consultation on additional duties for categorised services in July 2026. It will publish the final statement on terms of service guidance in early 2027 and the final policy statements in mid-2027.
The original intention was to publish a categorisation threshold in summer 2025, and the Secretary of State, Liz Kendall, was decidedly unimpressed with the delays, sending an open letter to Ofcom's CEO to express her displeasure. There is no doubt that this represents a major delay to implementation of one of the most significant elements of the OSA, but a reconsideration by Ofcom may, in the long run, lead to fewer challenges by businesses which feel they have been wrongly categorised.
On 3 November 2025, Ofcom launched a call for evidence on the use and effectiveness of age assurance under the Online Safety Act and on the role of app stores in children's exposure to harmful content. Ofcom will submit a report on age assurance by the end of July 2026 and a report on app stores by January 2027. It is possible that this will result in app stores being brought within scope of the OSA at some stage, although probably not in 2026.
That being said, we have already seen a number of changes to the OSA as a result of new priority offences being added. As public opinion trends towards more rather than less regulation of the online world, especially where children are concerned, the government may find itself under considerable political pressure to bolster the OSA by adding further priority offences, or even by bringing other types of content in scope.
The finalisation of the fee regime, expected by the end of 2025, will kick off the process for relevant platforms to submit revenue data to Ofcom for the 2026/27 charging year. Submissions will need to be completed within four months of the fee regime coming into force.
The direction of travel on implementation of the OSA is clear thanks to Ofcom's updated implementation timeline and, notwithstanding the delays to categorisation designations and additional safety duties, Ofcom evidently means business when it comes to enforcement. 2026 will be another busy year in the UK for online safety regulation.
2026: taking online safety seriously
The regulatory landscape for the protection of minors online continues to evolve rapidly across Europe. In the EU, while the DSA provides a harmonised baseline, national legislatures have developed sophisticated frameworks tailored to their jurisdictions. Germany's dual system and the UK's comprehensive multi-layered approach demonstrate the complexity and seriousness with which European and UK regulators are addressing child safety online. Service providers operating across these jurisdictions must navigate an increasingly demanding compliance environment, not least in light of significant enforcement mechanisms including substantial fines for non-compliance.
For more in-depth articles on different aspects of DSA and OSA compliance, see here.