In December 2023, the Spanish Data Protection Agency (AEPD), the main supervisory authority in Spain, issued a practical guide setting out ten standards for the protection of minors from inappropriate content on the internet (the Decalogue).
Many age verification systems currently in widespread use on the internet (self-declaration, sharing credentials with the content provider, age estimation by the content provider, or the use of intermediaries between the user and the online service provider) have clear shortcomings. These include potentially enabling minors to be tracked online, uncertainty about users' real age, exposure of identity to multiple online intermediaries, large-scale profiling, and the collection and processing of unnecessary data, among others.
The AEPD's Decalogue aims both to address the growing mental health problems of minors arising from excessive and uncontrolled use of online content and to provide a suitable solution to the above-mentioned risks. The Decalogue is intended to help online service providers comply with the GDPR when implementing robust age verification systems.
Among the different standards to consider, online service providers need to ensure that age verification systems do not allow minors to be identified, tracked, or traced. Instead, users should be exclusively treated as either “authorised individuals” or not. This means personal data related to age should not be processed. Consequently, to protect their general interest as vulnerable individuals and avoid potential risks, under no circumstance should the system be based on the user's disclosure of their status as a minor.
Anonymity is crucial to the implementation of age verification systems, which means that online service providers should not process personal data for the purposes of age verification, irrespective of other processing of personal data for which identification is in fact necessary (for example, the purchase of products or services). In this regard, it is important to understand that the obligation to carry out age verification to which some online service providers are subject is not considered by the AEPD to be a legal basis for processing children's personal data, since age verification should be achievable without such processing.
The status of "authorised individual" should only be checked when access to inappropriate content is requested. This is relevant for compliance with the data minimisation principle, given that online service providers do not need to verify that every individual accessing their services is an "authorised individual".
The principles of privacy by design and by default must be adhered to when creating age verification systems for the sake of children’s privacy, bearing in mind their special vulnerability, the potential impact of data breaches and the requirement to use clear and plain language that children can easily understand.
Labelling is key to classifying content as sexual, violent and/or racist, which will help online service providers to categorise content as inappropriate for children. However, adding multiple tags to the content accessed by a user may pose a risk of profiling, especially when such content may reveal sensitive data.
The risk of profiling or systematic monitoring is considerably reduced if access restriction is executed locally on the user's device rather than by the online service provider – or by third parties such as intermediaries. According to the AEPD, protection systems based on profiling users to determine whether they are minors entail systematic processing of personal data which may ultimately be regarded as excessive or disproportionate.
Collaboration with parents and guardians has proved essential for robust age verification systems and, consequently, they must be involved both in determining the specific content to be regarded as inappropriate and in establishing adequate identification mechanisms. This approach is consistent with Article 84 of the Spanish Data Protection Law, which sets out provisions for the protection of minors online with the assistance of their parents and/or guardians.
Taking as a basis the accountability principle laid out in Article 5(2) of the GDPR, governance is a crucial part of building compliance with the principles of this Decalogue. The implementation of an accountability framework must necessarily involve parents, child protection associations and researchers, and privacy experts who can understand children’s privacy, avoiding classic systems of self-declaration which might be effective for low-risk processing activities but do not generally offer real protection for minors.
Ultimately, age verification systems must strike a balance between respect for the right to digital education and the use of internet and the right to privacy and free will.
The AEPD has also issued specific video tutorials to provide a proof of concept of appropriate age verification systems at a practical level, depending on the device used to access the internet.
The proof of concept tests are based on the premise that identity management, age verification and content filtering can be clearly separated, demonstrating that identity providers currently offering self-identification for Spanish and European citizens do not need to build parallel digital systems for the specific purpose of preventing minors' access to unsuitable content.
Access to content from a computer can be managed by an age verification app installed on the user's mobile phone, which certifies their identity and issues a QR code as security clearance without providing identifying information (personal data). The browser then requests verification from the age verification app – by means of the QR code – in order to grant or refuse access to specific content previously labelled as adult or inappropriate content.
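As a rough illustration of this flow, the sketch below (with hypothetical function names, and a shared key standing in for what would in practice be a public-key signature scheme) shows how a device-side app could issue a short-lived "authorised" token to be encoded in the QR code, carrying no personal data, and how the verifying side could check it:

```python
import hashlib
import hmac
import secrets
import time

# Assumption: issuer and verifier share a key. A real deployment would
# use asymmetric signatures so the verifier learns nothing it can forge.
SHARED_KEY = secrets.token_bytes(32)

def issue_qr_token(user_is_authorised, ttl_seconds=60):
    """Device-side app: after a local age check, emit a short-lived token
    asserting only 'authorised' status -- never the user's age or identity."""
    if not user_is_authorised:
        return None  # no token is issued: the system never flags anyone as a minor
    expires = int(time.time()) + ttl_seconds
    nonce = secrets.token_hex(8)  # prevents token reuse across requests
    payload = f"authorised|{expires}|{nonce}".encode()
    sig = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "sig": sig}  # encoded into the QR code

def verify_qr_token(token):
    """Verifying side: check signature and expiry. The token contains no
    personal data, so no identifying information is processed."""
    payload = token["payload"].encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["sig"]):
        return False
    status, expires, _nonce = token["payload"].split("|")
    return status == "authorised" and int(expires) > time.time()
```

The point of the design is that the service provider only ever sees a signed boolean with an expiry, matching the Decalogue's requirement that users be treated solely as "authorised individuals" or not.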
As for access to content from mobile phones, eWallet solutions compatible with the eIDAS Regulation rely on "age attribution" systems to confirm a specific age, complying with the data minimisation principle. In this case, the browser requests verification by checking the age attribution generated by the eWallet and temporarily stored in the age verification app.
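The data minimisation idea behind an age attribution can be sketched as follows: the wallet derives a single yes/no attribute from the credential's date of birth, and only that boolean ever leaves the wallet (the function name and structure here are illustrative, not part of the eIDAS specification):

```python
from datetime import date

def age_attribution(date_of_birth, threshold=18, today=None):
    """Wallet-side: derive a yes/no 'over threshold' attribute from the
    credential's date of birth. Only the boolean is shared with the
    verifier -- the birth date itself never leaves the wallet."""
    today = today or date.today()
    # Subtract one year if this year's birthday has not yet occurred
    years = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return years >= threshold
```

Because the verifier receives only "over 18: yes/no" rather than a birth date, it cannot infer the user's exact age, which is precisely the minimisation property the Decalogue calls for.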
The AEPD's Decalogue confirms that age verification systems should be developed considering the principles laid out in the GDPR, in particular, data minimisation, privacy by design and default and accountability. Furthermore, anonymity must be considered when designing age verification systems, preventing profiling and helping protect privacy with respect to third-party providers.
From a technical standpoint, third parties should be prevented from acting as intermediaries between the user and the online service provider, for example, by developing tools so that the user's personal device is the one that executes the age verification mechanism, without using any external resource.
The protection of minors from inappropriate content must also comply with individuals' fundamental rights, and data protection legislation cannot be an obstacle to the development of adequate measures to protect minors. The responsible use of technology in the best interests of minors must be shaped and achieved by promoting collaboration between families and the government.
In this regard, in early January 2024, the Spanish Prime Minister announced that the government will pass a law to protect minors from internet pornography which is expected to encompass all the aspects reflected in the Decalogue and give assurance to stakeholders of data protection compliance.