As digital platforms increasingly shape how children and adolescents experience the online world, providers face growing scrutiny over how they protect minors from harmful content. This article explores key regulatory developments in three major jurisdictions: the EU’s Guidelines under Article 28 of the Digital Services Act (DSA), Germany’s Youth Media State Treaty (JMStV), and France’s new age-verification framework. We highlight the main obligations, trends and practical steps to help providers navigate this rapidly evolving compliance landscape. You can also read about the UK's online safety regime here.
EU: Article 28 DSA guidelines on the protection of minors
The European Commission’s new Guidelines under Article 28 DSA clarify what online platforms must do to ensure a “high level of privacy, safety and security” for minors. They apply to any online platform accessible to minors, that is, one that targets or is knowingly used by under-18s, regardless of stated age limits. While non-binding, the guidelines set an enforcement benchmark and call for a coherent, risk-based approach rather than a checklist of technical measures.
At a high level, online platforms must conduct a risk review assessing, among other things, the likelihood of minors accessing the service, the types of risk to minors (using the OECD’s “5 Cs”: content, contact, conduct, consumer and cross-cutting risks) and the effectiveness of mitigation steps. Providers must integrate child safety principles by design and by default, prioritising the child’s best interests and developmental needs. For more details on the recently published guidelines, please see here.
With respect to content, the guidelines identify a number of concrete measures that providers should consider when minors may use or access their service. For example:
- Age assurance methods are deemed appropriate in higher-risk scenarios. The guidelines distinguish between age estimation, self-declaration and full age verification. While they hold that self-declaration is generally insufficient to address risks arising from adult-only content (eg pornography, gambling), they expressly endorse age verification methods for such high-risk content. The Commission is developing an EU-wide age verification system to serve as an interim measure until the EU Digital Identity Wallet launches in 2026, enabling users to confirm their 18+ status privately and without revealing further personal data. Further details on the interplay of the GDPR and the DSA in the context of age assurance methods can be found in this recent article.
- For content design and moderation, minors’ accounts should default to the most protective settings. Features enabling unsolicited contact or excessive use (autoplay, push notifications, streaks) should be disabled, and recommender systems should rely on explicit user input rather than engagement signals to avoid harmful content loops.
- Specific protections around harmful or exploitative content: the guidelines call on online platforms to ensure they do not exploit minors’ commercial naivety (eg via loot boxes or manipulative adverts) and to apply extra protections to content created by minors, such as limiting the downloading or distribution of potentially sexualised material.
While the guidelines do not provide a one-size-fits-all checklist, they clearly set out what online platforms must do (via their risk review) to align with Article 28’s standard of a “high level” of protection for minors. From a compliance perspective this means reviewing product design, onboarding processes, recommender logic, age assurance approaches, content moderation regimes and transparency/documentation of these measures.
Germany: regime for the protection of minors (JMStV)
The JMStV forms the core of Germany’s online youth protection regime and complements the Youth Protection Act (JuSchG), which governs physical media and public events but also online games and movies (see here for more). Since the 2021 reform, both laws apply harmonised age ratings (0, 6, 12, 16, 18) and recognise the same certification bodies (FSK, USK). The JMStV safeguards minors from harmful online content through content restrictions, technical protection measures, labelling, and advertising rules.
Content restrictions
The JMStV distinguishes between three categories of content:
- absolutely prohibited content, ie material that violates German criminal law (eg violent or extremist material), which may not be made available at all
- adult-only content, including regular pornography or material clearly harmful to minors, which must be restricted to verified adults within closed user groups requiring two-step verification and secure authentication, and
- age-specific content, which may only be accessed by users of the appropriate age group, enforced via technical measures.
Technical protection measures
Platforms must use tools such as:
- closed user groups with verified identification and session-based authentication
- technical barriers that effectively block minors’ access
- age-labelling systems using standardised XML files compatible with parental control software, and
- watersheds, restricting 18-rated content to 11pm-6am and 16-rated content to 10pm-6am (see the sketch below).
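For illustration only, here is a minimal sketch of how a provider might implement such a watershed check, assuming German local time and the 10pm/11pm windows above; the JMStV prescribes the time windows, not any particular implementation:

```python
from datetime import datetime, time
from zoneinfo import ZoneInfo

# Illustrative watershed windows (see above): 18-rated content may only
# be made available 11pm-6am, 16-rated content 10pm-6am, German time.
WATERSHED_START = {18: time(23, 0), 16: time(22, 0)}
WATERSHED_END = time(6, 0)

def watershed_allows(age_rating: int, now: datetime | None = None) -> bool:
    """Return True if content with this age rating may be shown at `now`."""
    now = now or datetime.now(ZoneInfo("Europe/Berlin"))
    start = WATERSHED_START.get(age_rating)
    if start is None:
        return True  # no fixed watershed for 0/6/12-rated content in this sketch
    t = now.time()
    # The permitted window crosses midnight: union of [start, midnight) and [midnight, end).
    return t >= start or t < WATERSHED_END

# Example: 16-rated content is allowed at 22:30 but not at 21:00.
```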
Labelling and advertising
Currently, online audio content does not require age labels, while video labelling applies only where equivalent physical media exist. The upcoming JMStV reform will extend labelling duties to all original film content, ie films without a physical-media equivalent. Advertising must not exploit minors’ inexperience, directly encourage minors to make purchases, or promote alcohol in a child-targeted manner.
Video-sharing and app store compliance
Video-sharing services must adopt appropriate measures such as age verification, parental controls and rating systems, and maintain an effective notice-and-takedown mechanism. In addition, the recent JMStV reform will require designated operating systems commonly used by minors to implement parental control systems based on the specific age categories, allowing parents/legal guardians to control access to age-inappropriate content. These systems will also check the device’s age setting against the age ratings apps carry in app stores; apps rated above the configured age setting risk being blocked. This may have an impact on all app/content providers, who must ensure their apps are accurately rated and compatible with these systems. For more details on the reform of the JMStV, please see this article.
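Purely as an illustration of the mechanics described above (the reform mandates the outcome, not any particular implementation), a hypothetical OS-level parental-control check might compare app store age ratings against the device’s age setting like this:

```python
from dataclasses import dataclass

# Harmonised German age categories under the JuSchG/JMStV.
RATINGS = (0, 6, 12, 16, 18)

@dataclass
class App:
    name: str
    store_age_rating: int  # age rating the app carries in the app store

def is_blocked(app: App, device_age_setting: int) -> bool:
    """Hypothetical check: block apps rated above the device's age setting."""
    return app.store_age_rating > device_age_setting

# Example: with the device configured for a 12-year-old, an app rated 16
# is blocked, while one rated 6 remains available.
apps = [App("chat_app", 16), App("learning_app", 6)]
print([a.name for a in apps if is_blocked(a, device_age_setting=12)])
# -> ['chat_app']
```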
France: update on age verification mechanisms
Even before the European Commission issued its Guidelines on the protection of minors under Article 28 DSA, France had adopted two laws requiring certain service providers to implement age verification mechanisms. The interaction of this legislation with EU law remains widely debated, with multiple legal proceedings pending.
Law of 7 July 2023
This law establishes a digital age of majority, set at 15, and introduces an obligation on online social media service providers to verify the age of any new user creating an account on their services, as well as that of existing users. Access to the social media service must be denied to anyone under 15, unless the provider has obtained the express consent of one of the holders of parental authority over the minor concerned.
Although this law was adopted by the French Parliament, its entry into force was conditional on the adoption of an implementing decree, which has never been issued. As a result, these obligations are currently not enforced in France. This situation stems from the European Commission’s warning that the provisions conflicted with EU law, notably the DSA and Article 3 of the e-Commerce Directive, which establishes the country-of-origin principle. However, political pressure to prohibit access to social media services for minors under 15 remains strong, with President Macron recently reiterating his intention to implement the ban, despite the lack of consensus on this question at EU level.
Law of 21 May 2024 on the Security and Regulation of the Digital Space (SREN)
This requires providers of pornographic video-sharing platforms and publishers of pornographic websites to deploy an age verification system that complies with a technical framework established by the Regulatory Authority for Audiovisual and Digital Communication (ARCOM), with the aim of preventing minors from accessing such content.
This obligation is not entirely new: prior to 2024, the French Criminal Code already prohibited the dissemination of pornographic content where it was likely to be seen or perceived by a minor. That provision expressly stated that a simple declaration of age by the user was insufficient, thereby requiring, in practice, any person distributing pornographic content to implement an age verification mechanism.
This legal framework has already led to legal challenges, some still pending. In 2023, two Czech-based pornographic website publishers appealed to the Conseil d’État (French Administrative Supreme Court), contesting the age verification requirement as incompatible with EU law and the country-of-origin principle of the e-Commerce Directive. Under this principle, an online service provider is subject to the laws and supervision of the Member State in which it is established with respect to the “coordinated field,” and other Member States may not restrict the free provision of its services by imposing additional obligations. Preliminary questions were referred to the Court of Justice of the European Union to determine, in particular, whether the obligation to implement age verification for online service providers falls within the “coordinated field,” even though it arises from provisions of French criminal law. On 18 September 2025, Advocate General Szpunar delivered his opinion, recommending that the Court answer this question in the affirmative. The CJEU’s decision is expected in the coming months.
In 2024, the SREN law went a step further by:
- requiring that age verification mechanisms comply with minimum technical standards defined by ARCOM (the technical framework was published on 11 October 2024, with a three-month transitional period for operators to ensure compliance), and
- granting ARCOM new enforcement powers, including the ability to issue formal notices and impose sanctions on non-compliant service providers (up to €150,000 or 2% of global annual turnover, rising to €300,000 or 4% for repeated violations). ARCOM may also require providers of intermediary services, such as internet service providers, domain name resolution providers or search engines, to block or delist non-compliant websites and platforms.
The ARCOM technical framework has itself been challenged in court, but so far those actions have been unsuccessful.
Regarding territorial scope, the SREN law’s age verification obligations apply to providers established in France or outside the European Union. For providers based in another EU Member State, the law provides that the obligations may only be imposed "in accordance with Article 3 of the e-Commerce Directive", after the services concerned have been explicitly designated by ministerial order. Such an order was issued on 26 February 2025, designating 17 pornographic websites and content-sharing platforms. Some providers filed appeals against the order. On 15 July 2025, the Conseil d’État refused to provisionally suspend the order, so the obligations remain fully enforceable while the appeal on the merits is pending. Some platforms have since withdrawn from France, protesting that the system is ineffective and compromises users’ privacy.
Outlook
Complying with child-protection requirements in the digital sphere remains a moving target. The Commission’s Article 28 DSA Guidelines offer welcome clarity on what “appropriate and proportionate” measures should look like, yet their practical implementation (particularly around age assurance) will take time and technological maturity. In Germany, the updated JMStV adds further complexity by layering national requirements onto the DSA framework (raising questions as to whether those national rules can stand alongside the DSA at all), meaning providers must continue to navigate both EU- and domestic-level rules, as well as the JuSchG for certain services. Similar issues arise in France and in other Member States, where local law may create a complex compliance framework in conjunction with the DSA.
The broader challenge for providers is consistency: online services cross borders, but regulatory obligations do not. As individual countries both within and outside the EU tighten their regimes through age verification and online safety laws, platforms will need to design global compliance frameworks that accommodate diverging national standards while upholding the shared policy goal of ensuring that minors can participate safely in the digital environment.