The debate around the Digital Services Act (DSA) is shifting. After an initial focus on large tech companies, attention now appears to be turning to the protection of minors on smaller platforms. This strategic pivot is based not on new legislation but appears to stem from the application of Article 28 DSA, supported by the new EU guidelines and relevant scientific findings. This article examines the legal and technical drivers behind this development and outlines practical steps that platform operators may wish to consider to prepare proactively.
For months, the DSA debate revolved around the regulation of Very Large Online Platforms (VLOPs). However, a strategic shift now appears to be underway: according to a Commission news article of 10 October 2025, enforcement of child protection obligations may be extended to smaller online platforms. Reportedly, the European Board for Digital Services, acting through its Working Group on the Protection of Minors, has agreed to coordinate with the competent authorities on measures to ensure that smaller platforms comply with the DSA. This development does not constitute a change in the legal framework; rather, it appears to form part of a planned escalation in the DSA's implementation cycle, which, according to the Commission, rests on a combination of legal, psychological, and technological insights.
Legal Basis and Catalyst for the Shift
The legal obligation for online platforms accessible to minors to ensure a "high level of privacy, safety, and security" is set out in Article 28 DSA and has been in effect since 17 February 2024. Only micro and small enterprises are exempt from this provision (Article 19 DSA).
The trigger for the current enforcement wave appears to have been the publication of the final Guidelines on the Protection of Minors by the European Commission. Although not legally binding, these guidelines are likely to be regarded by authorities as an important benchmark. They were developed following a public consultation and through focus groups involving more than 150 young people.
Strategic Rationale: Addressing Pervasive Risks and Preventing Risk Migration
Supervisory authorities appear to be expanding their focus for several strategic reasons:
Addressing pervasive risks: Dangers such as grooming, cyberbullying, or exposure to self-harm content are, according to the EU Kids Online research network, not limited to VLOPs. The network regularly publishes reports intended to substantiate this view.
Preventing risk migration: The Commission has warned that harmful content could migrate from heavily monitored VLOPs to less regulated platforms. After enforcement actions against VLOPs were perceived as a strong market signal, enforcement is now expected to continue in order to ensure a consistent level of child protection across smaller platforms as well.
Psychological and Technological Dimensions of Compliance
The DSA’s strict requirements are understood by the Commission as a response to scientific findings.
The adolescent brain and manipulative design:
A study published by, among others, the Joint Research Centre (JRC) of the European Commission indicates that children are in a critical phase of neural development and may be particularly susceptible to “addictive” design features such as infinite scrolling. The call for “Safety by Design” in the DSA Guidelines appears to represent a regulatory response to these concerns.
A systematization of online dangers:
The risk framework presented by the German Federal Agency for the Protection of Children and Youth in the Media (BzKJ) and psychological studies on topics such as cyberbullying, hate speech, sexting, and cyber grooming appear to support the relevance of the OECD’s “5C” risk typology (Content, Conduct, Contact, Consumer, Cross-cutting) from 2021, which is also reflected in the Guidelines. However, it remains to be seen how risk assessments in line with the 5C typology will be implemented in practice, as platforms are still awaiting model approaches or further guidance from the Commission.
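In the absence of model approaches from the Commission, one plausible way to operationalise the 5C typology is as a structured, documented risk register. The following Python sketch is purely illustrative: the class names, the 1-5 scoring scale, and the high-risk threshold are assumptions of this article, not anything prescribed by the Guidelines or the OECD.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

# The five risk categories of the OECD (2021) typology referenced in the Guidelines.
class RiskCategory(Enum):
    CONTENT = "content"
    CONDUCT = "conduct"
    CONTACT = "contact"
    CONSUMER = "consumer"
    CROSS_CUTTING = "cross-cutting"

@dataclass
class RiskEntry:
    category: RiskCategory
    description: str   # e.g. "exposure to self-harm content"
    likelihood: int    # 1 (rare) .. 5 (frequent); illustrative scale
    severity: int      # 1 (minor) .. 5 (severe); illustrative scale
    mitigation: str    # documented countermeasure

    @property
    def score(self) -> int:
        # Simple likelihood-times-severity scoring, chosen for illustration only.
        return self.likelihood * self.severity

@dataclass
class RiskAssessment:
    platform: str
    assessed_on: date
    entries: list[RiskEntry] = field(default_factory=list)

    def high_risk(self, threshold: int = 12) -> list[RiskEntry]:
        """Entries whose combined score meets the (illustrative) threshold."""
        return [e for e in self.entries if e.score >= threshold]

# Example: a documented assessment covering two of the five categories.
assessment = RiskAssessment(platform="example-platform", assessed_on=date(2026, 1, 15))
assessment.entries.append(RiskEntry(
    RiskCategory.CONTACT, "cyber grooming via direct messages", 3, 5,
    "default-private profiles for minors"))
assessment.entries.append(RiskEntry(
    RiskCategory.CONSUMER, "manipulative in-app purchase prompts", 2, 3,
    "purchase confirmation dialogs"))

for entry in assessment.high_risk():
    print(entry.category.value, entry.score)  # prints: contact 15
```

Whatever the concrete scoring method, the point such a register would serve is evidentiary: each entry records the risk, its evaluation, and the chosen mitigation in a form that can be produced during an investigation.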
The technical challenge of age verification:
The implementation of Article 28 DSA may create tensions with the GDPR. As a potential solution, the EU promotes privacy-enhancing technologies (PETs). The "Age Verification Blueprint" developed by the Commission is viewed as a reference standard and is intended to be compatible with the forthcoming EU Digital Identity Wallet (EUDIW).
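By way of illustration, the data-minimising age check that PETs aim for can be sketched as follows. This is a deliberately simplified stand-in, not the Commission's Blueprint or the EUDIW protocol: it uses a shared-secret HMAC where real systems would rely on public-key signatures or zero-knowledge proofs, and all names and values are hypothetical. The key property it demonstrates is that the platform learns only a yes/no attestation, never the birthdate.

```python
import hashlib
import hmac
import json

# Stand-in for a trusted issuer's key. A real deployment would use public-key
# signatures (or zero-knowledge proofs), never a secret shared with platforms.
ISSUER_KEY = b"demo-issuer-key"

def issue_age_token(user_id: str, over_threshold: bool, threshold: int = 18) -> dict:
    """Issuer side: attest only the boolean 'over threshold', never the birthdate."""
    claim = {"sub": user_id, "age_over": threshold, "result": over_threshold}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def verify_age_token(token: dict) -> bool:
    """Platform side: check integrity and the attested boolean; nothing else."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["tag"]):
        return False  # tampered or forged token
    return token["claim"]["result"] is True

token = issue_age_token("user-123", over_threshold=True)
print(verify_age_token(token))  # prints True; the platform sees no birthdate
```

The GDPR tension described above is eased precisely because the platform stores only the verified boolean, which is consistent with the data-minimisation principle the PET approach is meant to serve.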
A Coordinated Enforcement Architecture: Outlook and Next Steps
- Q4 2025 – Q2 2026: Anticipated finalisation of “common instruments” and identification of high-risk platforms.
- From mid-2026: Possible start of the first investigations by national authorities.
- Late 2026 – 2027: First enforcement decisions and potential fines.
Recommendations for Online Platforms
Proactive, systemic risk management is now essential for all online platforms.
Conduct risk assessments: Platforms should proactively carry out documented risk assessments, using the Commission's Guidelines and the "5C" typology as a reference tool.
Embed "Safety by Design": Safety considerations should be integrated early into product development cycles; this principle is likely to form a key component of the emerging regulatory philosophy.
Document everything: In the event of an investigation, demonstrating a well-reasoned and documented process will be the best defence.
The ability to demonstrate a thoughtful, proportionate, and well-documented approach to child safety may prove decisive in mitigating regulatory risks.