The social and technological advances brought about by the digitalisation of our society are driving profound transformations that have a significant impact on the rights, protection, and holistic development of children and youth.
The benefits of digitalisation and democratisation of access to digital environments must go hand in hand with measures to protect children and youth in cyberspace, placing their rights and best interests at the forefront. Recent measures adopted in Spain attempt to do this.
The General Audiovisual Communication Law
The Spanish Law 13/2022 of July 7, the General Audiovisual Communication Law (LGCA), transposes the European Audiovisual Media Services Directive (AVMSD) and establishes obligations for video-sharing platform service providers, as well as for influencers or highly relevant content creators, to ensure the protection of minors. These platforms, such as YouTube or TikTok, provide the public with videos or programs created by users, without the platform assuming editorial responsibility. The LGCA outlines a series of mechanisms designed to safeguard minors.
As an initial measure, platforms are required to implement age verification systems to prevent minors from accessing content that could harm their physical, mental, or moral development. This obligation also extends to commercial communications within the platforms. In particular, access to content such as gratuitous violence or pornography is prohibited, as these materials are deemed particularly harmful and have a significantly greater impact on minors than on other users.
Additionally, platforms are required to provide parental control mechanisms that enable end users to monitor and restrict access to potentially harmful content. To ensure the effectiveness of these controls, platforms should implement systems that classify content based on recommended age.
The LGCA also includes a response mechanism to address situations where content has not been adequately restricted. This mechanism, based on reporting and monitoring systems, enables users to effectively report such content, applying to both user-generated videos and commercial communications broadcast on the platforms. This reinforces the monitoring and corrective responsibility of the platforms, which must assess reports and, if necessary, restrict or remove content.
Users of particular relevance, or influencers, are obliged to use the child protection mechanisms provided by the platforms. This includes the requirement to classify their content based on the recommended age, ensuring a safe digital environment that upholds the rights of minors.
The supervision of compliance with these obligations will fall under the responsibility of the National Commission on Markets and Competition (Comisión Nacional de los Mercados y de la Competencia, or CNMC).
Draft Organic Law for the Protection of Minors in Digital Environments
One of the most significant measures in Spain for the protection of minors online is the Draft Organic Law for the Protection of Minors in Digital Environments (Draft Organic Law).
The Draft Organic Law aims to ensure safe digital environments for minors, safeguarding their rights and promoting the responsible use of technology.
The Draft proposes amending parts of the LGCA, including by:
- obliging platform service providers to include on their corporate websites a visible and accessible link to the website of the competent audiovisual authority
- requiring the competent audiovisual authority to oversee and ensure the proper functioning of age verification systems, aiming to prevent minors from accessing harmful content
- obliging platforms to set parental controls by default
- aligning the regulatory framework for influencers on platforms with the obligations imposed both on live and on-demand television audiovisual communication services, based on the type of content they publish
- requiring influencers to comply with the additional obligations set out in articles 99.2 or 99.3 of the LGCA, depending on the content they publish, which, by way of example, include the limitation of time slots for content inappropriate for minors and the prohibition on broadcasting content containing gratuitous violence or pornography
- granting authority to the CNMC, in its role as supervisory entity, to impose, as an ancillary sanction in respect of very serious infringements, the cessation of the provision of services to video-sharing platforms that fail to comply with the obligation to establish and operate effective age verification.
Spanish Data Protection Agency activities
The Spanish Data Protection Agency (AEPD) has been deeply involved in initiatives aimed at protecting minors from accessing adult content on the internet. In December 2023, it presented an age verification system for implementation by online platforms, designed to protect minors from adult content while safeguarding their rights to privacy, intimacy and self-image. The proposal is based on the principles of data minimisation and anonymisation, ensuring that data processing is strictly limited to what is necessary and that age verification does not involve the exposure of unnecessary identifying information.
It is worth noting that, in its proposal, the AEPD developed a Decalogue of Principles, which sets out guidelines that a system for protecting minors from inappropriate content must adhere to.
The proposed principles are described below, together with a summary of the mechanisms proposed for their implementation:
| Principle | Implementation mechanism |
| --- | --- |
| The system for protecting minors from inappropriate content must guarantee that the identification, tracking or location of minors over the internet is impossible. | To comply with this principle, measures should be applied under the criteria of data minimisation and local processing on the device, avoiding centralised databases or intermediaries that could compromise the privacy and security of minors. |
| Age verification should be aimed at ensuring that users of the appropriate age prove they are "authorised to access", not at verifying the status of a minor. | To comply with this principle, measures should be implemented to avoid the use of tools such as biometric analysis, profiling or obtaining credentials for verification, given that these practices could compromise the security of minors by exposing their information to online platforms or linked third parties. |
| Accreditation for access to inappropriate content must be anonymous for internet service providers and third parties. | To comply with this principle, verification mechanisms should process the necessary information exclusively on the user's device, ensuring that no personal data is transferred to external servers and that no intermediaries are used. In addition, the attributes used must be designed so that they cannot be linked to the user's identity or generate traceable browsing patterns. |
| The obligation to prove the age of the person "authorised to access" will be limited only to inappropriate content. | To comply with this principle, protection systems should not rely on profiling techniques or continuous monitoring of users, as this would lead to disproportionate processing of personal data and would violate the principle of data minimisation. |
| Age verification must be carried out accurately, and the age categorised as "authorised to access". | Verification systems should be limited to a binary authorised/unauthorised classification. Additionally, online platforms should use a verification method that does not rely on estimation, which often requires additional information about the individual, such as gender or race. |
| The system for protecting minors from inappropriate content must ensure that users cannot be profiled based on their browsing. | Platforms must label content with specific descriptors such as "violent", "explicit sexual content" or "racist" to assess its suitability based on the user's age, and must apply the aforementioned binary classification to users. |
| The system must guarantee the non-linking of a user's activity across different services. | The use of unique identifiers or codes that are reusable across multiple platforms should be avoided. Mechanisms that collect metadata, such as geolocation, or other elements that could directly or indirectly identify the user should also be eliminated. |
| The system must guarantee the exercise of parental authority by parents. | Age verification policies must be established in consultation with families, either directly or through their representatives, associations and foundations dedicated to the protection of minors. |
| Any system for protecting minors from inappropriate content must guarantee all people's fundamental rights in their internet access. | Protection systems should be implemented without user identification or profiling based on habits or sensitive data. Likewise, the definition of inappropriate content should be restricted to what is strictly necessary, guaranteeing equitable and barrier-free access for vulnerable groups and respecting the principles of privacy, equality and non-discrimination. |
| Any system for protecting minors from inappropriate content must have a defined governance framework. | Online platforms must establish verification systems that are transparent, auditable and assessable to minimise risks such as security breaches. In addition, they should integrate with national and European digital identity management systems, offering options adapted to different platforms and social contexts. |
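The core mechanism the Decalogue describes — processing on the user's device and returning only a binary authorised/unauthorised result, with no identifying data leaving the device — can be illustrated with a minimal sketch. This is purely illustrative and not part of the AEPD proposal: all names are hypothetical, and a real deployment would rely on cryptographic attestations (for example, via European digital identity systems) rather than application code.

```python
# Illustrative sketch only (not the AEPD's design): a device-local,
# binary age check reflecting the Decalogue's data-minimisation principles.
from dataclasses import dataclass
from datetime import date

@dataclass
class LocalProfile:
    """Hypothetical record held only on the user's device; never transmitted."""
    birth_date: date

def authorised_to_access(profile: LocalProfile, minimum_age: int, today: date) -> bool:
    """Computes the binary 'authorised to access' result on the device.

    Only this boolean would reach the platform: no birth date, exact age,
    or other identifier leaves the device, consistent with data minimisation
    and the binary authorised/unauthorised classification.
    """
    # Exact age computation, accounting for whether this year's birthday has passed.
    age = today.year - profile.birth_date.year - (
        (today.month, today.day) < (profile.birth_date.month, profile.birth_date.day)
    )
    return age >= minimum_age

# Example: for 18+ content, the device answers only "authorised?" — yes or no.
adult = LocalProfile(birth_date=date(2000, 5, 1))
print(authorised_to_access(adult, 18, today=date(2025, 1, 1)))  # True
```

Note the design choice the sketch encodes: the platform never learns the user's age, only the yes/no outcome, so the same check cannot be reused to profile or track the user across services.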
The CNMC will be responsible for assessing the suitability of the age verification systems implemented by the platforms, based on a mandatory report to be issued by the AEPD in accordance with the criteria established in the Decalogue of Principles.
An evolving framework
Like other EU Member States, Spain is enacting online child safety measures both in accordance with EU law and at its own initiative. As in other countries, including the UK, regulators (in this case the CNMC and the AEPD) will cooperate across the framework to help protect children online.