On 14 July 2025, the European Commission published its guidelines on the protection of minors under the Digital Services Act (DSA), which aim to ensure a safe online experience for minors (those under 18).
What's the issue?
Under Article 28 of the DSA, online platforms accessible to minors are required to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service. Platform providers must also not show advertisements based on profiling where they are aware with reasonable certainty that the recipient of the service is a minor. Article 28 also gives the Commission scope to issue guidelines to help online providers comply with these obligations.
What's the development?
The newly published guidelines set out a non-exhaustive list of "proportionate and appropriate" measures to protect minors from online risks such as grooming, harmful content, problematic and addictive behaviours, as well as cyberbullying and harmful commercial practices.
Like the DSA, the guidelines adopt a risk-based approach, recognising that online platforms may pose different types of risks to minors depending on their nature, size, purpose and user base. The guidelines enshrine a safety- and privacy-by-design approach and are grounded in children's rights: measures taken should not disproportionately or unduly restrict those rights. The guidelines apply to all online platforms accessible to minors, other than small and micro enterprises.
The guidelines centre on four general principles, which should be considered holistically:
- proportionality and appropriateness: a balance needs to be struck between the measures taken and minors' fundamental rights
- protection of children's rights: all rights should be considered and measures should not discriminate based on any grounds
- privacy-, safety- and security-by-design: high standards of privacy, safety and security need to be integrated into the design of products
- age-appropriate design: consider and align with the developmental, cognitive and emotional needs of minors.
Risks should be assessed by reference to the '5Cs':
- content risks: exposure to potentially harmful, harmful, hateful or illegal content or disinformation
- conduct risks: behaviours minors may actively adopt online which can pose risks both to themselves and others
- contact risks: harmful interactions with others, including cyberbullying, grooming or illegal conduct
- consumer risks: via advertising, profiling, financial pressures and fraud
- cross-cutting risks: risks that cut across all of the above categories and may significantly affect minors' lives, including advanced technology risks, health and wellbeing risks, and additional privacy and data protection risks which might lead to predators locating and approaching minors.
Key recommendations
These include:
- setting minors' accounts to private by default so that personal information, data and social media content are hidden
- modifying platforms' recommender systems to lower the risk of minors encountering harmful content, and empowering minors to have more control of their feeds
- empowering children to block and mute any user
- prohibiting accounts from downloading or taking screenshots of content posted by minors
- disabling by default features that contribute to excessive use, such as communication "streaks", ephemeral content, "read receipts", autoplay and push notifications, as well as other design features aimed at driving engagement
- putting safeguards around AI chatbots integrated into online platforms
- ensuring that minors' lack of commercial literacy is not exploited and that they are not exposed to commercial practices that may be manipulative or lead to unwanted spending or addictive behaviours
- introducing measures to improve moderation and reporting tools, with requirements for prompt feedback
- meeting requirements for parental control tools.
Age assurance and verification
The guidelines recommend the use of effective age assurance methods that are accurate, reliable, robust, non-intrusive and non-discriminatory. In particular, the guidelines recommend age verification methods to restrict access to adult content such as pornography and gambling, or where national rules set a minimum age to access certain services. In other cases, the guidelines recommend age estimation, eg where terms and conditions prescribe a minimum age below 18 due to identified risks to minors. Age estimation methods should be provided by independent third parties or through independently audited systems, and ideally a range of methods should be offered. Self-declaration does not constitute appropriate age assurance.
What does this mean for you?
In addition to following the key recommendations above, in-scope platforms need (among other things) to:
- Conduct comprehensive risk reviews covering likely use by minors and the associated risks. These should be carried out annually and whenever there is a significant change to the service, and the results need to be published (excluding sensitive details)
- Identify and put in place additional protective measures
- Develop clear policies on harmful content and content moderation (which should include human review)
- Provide reporting and redress mechanisms and appropriate levels of user control
- Provide clear information to minors using plain, intelligible language
- Comply with transparency and governance requirements and ensure information about age assurance, recommender systems, AI tools, content moderation and other policies is available in a child-friendly, age-appropriate and accessible manner.
For those organisations caught by the UK's Online Safety Act, many of the recommendations in the EC guidelines will be familiar and are in line with measures recommended in Ofcom's Children's Codes of Practice, which include:
- Safer feeds – services whose recommender systems pose a medium or high risk of harmful content must configure their algorithms to filter out harmful content from children's feeds.
- Effective age checks – the highest-risk services must use highly effective age assurance to prevent children from accessing harmful material while allowing adults to access legal content. Services that do not do this must assume that younger children are on their service and ensure they have an age-appropriate experience.
- Fast action – all sites and apps must have review and takedown processes in place so that they can act quickly on becoming aware of harmful content.
- More choice and support for children – children must be given more control over their online experience, including the ability to indicate content they don't like, block and mute accounts, and disable comments on their own posts. There must also be supportive content for children who may have encountered or searched for harmful content.
- Easier reporting and complaints – a straightforward way to report content or make complaints must be provided and service providers must respond appropriately. Terms of service must be clear and understandable by children.
- Strong governance – all services must have a named person accountable for children's safety and a senior body should annually review the management of risk.
The guidelines are not legally binding, but the Commission will use them to assess compliance with Article 28(1) of the DSA. They will serve as a reference point for checking whether online platforms accessible to minors meet the necessary standards and may inform enforcement action by national regulators, although following the guidelines does not guarantee compliance.