Children and young people are increasingly likely to consume online content, such as user-generated videos, rather than watch television, and harmful content and hate speech have become particularly problematic in this context. While questions remain about exactly what counts as a video-sharing platform (VSP), with guidance expected soon, some services regularly used by children, such as Facebook and YouTube, will come within the remit of the AVMSD. Member States (and the UK, which for the purposes of this article we will count as a Member State) have until September 2020 to implement the updated AVMSD provisions safeguarding children from harmful content.
How does the AVMSD protect children?
VSPs will need to take appropriate measures to protect minors from programmes, user-generated videos and commercial communications that may impair their development. More generally, the AVMSD requires Member States to ensure that media service providers do not make content that may impair the physical, mental or moral development of minors available in such a way that minors are likely to hear or see it. Potential viewers must also be given sufficient information about content so that parents and minors can make informed decisions about it.
This is obviously easier to implement where the service provider controls the environment and determines when (eg by selecting the time of broadcast), how (eg after an acoustic warning) and in what form viewers access content. Over-the-top services bypass such traditional controls, which makes the new requirements correspondingly difficult to implement. Whatever the challenges, the measures must be proportionate to the harmfulness of the content, with the most harmful content (such as gratuitous violence) subject to the strictest measures (such as encryption and effective parental controls). Solutions may include systematic content descriptors (or other means of description), visual symbols, age verification tools, transparent and user-friendly content reporting mechanisms, and parental control systems.
Because child protection mechanisms will inevitably lead service providers to process children's personal data, the AVMSD also mandates that this data must not be used for commercial purposes such as direct marketing, profiling or behaviourally targeted advertising. This aligns with the GDPR, which already recognises that the processing of children's personal data merits specific protection, especially when it is used for marketing or for creating personality or user profiles. It also aligns with the purpose limitation principle (Article 5(1)(b)), under which data processed for a specific purpose, such as restricting access to age-inappropriate content, cannot be used for another purpose incompatible with the original one, such as commercial exploitation (read more about the GDPR and children here).
Commercial communications should not cause physical, mental or moral detriment to minors – for example, they should not exploit minors' inexperience or credulity, exploit the special trust minors place in parents or teachers, or unreasonably show minors in dangerous situations. For on-demand audiovisual media services, communications about alcoholic beverages should not be aimed specifically at minors or show minors consuming them, in line with the other requirements of Article 22 that already apply to television advertising and teleshopping; instead, responsible drinking messages should be conveyed. Children's exposure to foods and drinks high in salt, sugars, fat, saturated fats and trans-fatty acids should also be effectively reduced, and minors should be protected from gambling promotions.
Additionally, product placement will not be allowed in children's programmes, not least because it can affect children's behaviour when they are unable to recognise its commercial nature. Teleshopping will also be prohibited during children's programmes, and Member States may further prohibit the sponsorship of children's programmes.
On top of this, there is an overarching fundamental rights angle – the AVMSD's 2010 predecessor sought to promote the rights of the child enshrined in the EU Charter of Fundamental Rights. Those fundamental rights must now be carefully balanced, especially when adopting measures to protect minors from harmful content.
How will the UK implement the child protection measures in the AVMSD?
The UK Government will amend section 368E of the Communications Act 2003 to require the provision of sufficient information to viewers in the context of on-demand programme services (ODPSs). Section 368E(5)(c) (which relates to specially restricted material available on ODPSs) will also be amended from regulating material that "might seriously impair" the physical, mental or moral development of under-18s to material that "may impair" it. We also know that the government will not mandate the use of a standardised system of content descriptors.
How does this fit in with other initiatives to protect children online?
Online harms
Originally, the government had planned to implement most of the AVMSD's VSP requirements as part of a wider online harms regulatory framework (under the planned "duty of care"). Because that framework will not be ready by the AVMSD implementation deadline of 20 September 2020, a minimal interim implementation of the AVMSD will be actioned in the meantime, with Ofcom becoming, at least temporarily, the national regulatory authority (read more on online harms proposals here).
Advertising
The UK stakeholders responding to the government's AVMSD implementation consultation broadly supported the proposal that the Advertising Standards Authority (ASA) be a co-regulator for commercial communications (there is already an Ofcom/ASA co-regulation scheme for video-on-demand). Co-regulation and self-regulation are actively encouraged by the AVMSD, so the government will extend the existing ODPS co-regulatory system to VSPs as needed to minimally implement the AVMSD requirements. The government has also called for evidence on regulating online advertising; one of its questions asks how effectively the current regulatory system prevents the exploitation of vulnerable people (see here for more).
Data protection
The ICO's Age Appropriate Design Code provides guidance on privacy standards for children and looks at measures like self-declarations, AI, third party age verification services, account holder confirmations, technical measures and hard identifiers. The government has confirmed that the Code is consistent with the AVMSD requirements (see here for more).
What's the end result?
The AVMSD mandates genuine regulatory independence from government and industry to ensure that the best interests of viewers (including children) are upheld. On top of this, the European Regulators Group for Audiovisual Media Services will be able to give technical advice on the protection of minors and on commercial communications for foods high in fat, salt or sodium and sugars.
There is no doubt that the regulatory framework for protecting children is complex. While it may be impossible to guarantee that children will always be protected in an ever-changing online environment, the AVMSD sets out to provide important protections; how successful these prove remains to be seen.
If you have any questions on this article please contact us.