Authors

Jo Joyce
Senior Counsel

Alex Walton
Associate

30 July 2021

Online Safety Bill – 4 of 6 Insights

The regulation of child safety online – an update

  • In-depth analysis

Digital regulation within the UK has advanced significantly in 2021 as policy makers attempt to address the issues caused by the inexorable growth of the online world. Concerns that children are at particular risk are now being met in the UK with policy and legislative proposals to ensure their safety online.

The Online Safety Bill (OSB), once in force, will impose a statutory duty of care on providers of certain user-to-user and search services to protect their users from harm. All organisations within scope will have child-specific duties and obligations to mitigate the risk of harm to children online.

The Department for Digital, Culture, Media and Sport (DCMS) recently published a collection of guidance to help businesses protect children on their online platforms. Although it is not specifically tied to any legislative obligations and is targeted at small and medium-sized enterprises, it provides high level guidance around data protection, age appropriate consent, and protecting children from child sexual exploitation and abuse (CSEA) and terrorism content.

Organisations in scope also need to comply with the Children's Code, a statutory code created by the Information Commissioner's Office (or ICO, the UK's data protection authority). The Children's Code, also known as the Age Appropriate Design Code, applies now and the ICO will begin enforcing it from 2 September 2021. The Code cannot be enforced on its own; however, the ICO will rely on it when considering possible breaches of the UK GDPR regarding children's data in the context of digital services. 

The Children's Code introduces 15 standards to be observed by businesses involved in the processing of children's data for relevant services (you can read more about it here and here).

Although they are devoted to addressing different aspects of child safety online, there are clear synergies between the broad obligations created by both the Children's Code and the OSB, indicating that the direction of travel, in the UK at least, favours an increasingly hands-on approach. 

The Online Safety Bill

Who is a child?

For both the OSB and the Children's Code, a child is a person under the age of 18. 

Do children use your services?

Providers will be within scope for services which are "likely to be accessed by children". This is the same language used in the Children's Code – it covers not only services targeted at children, but also those likely to be accessed by them. In practice, this is likely to be interpreted broadly: ICO guidance suggests that "likely" means "more probable than not", and a similar interpretation will probably apply in the context of the OSB.

Providers are, in effect, under a duty to carry out a scoping exercise to establish whether children use the service. This assessment should determine:

  • whether it is possible for children to access the service (or any part of it), and
  • if such access is possible, whether the child user condition is met.

The child user condition will be met if there are a significant number of children who are users of the service, or the service is likely to attract a significant number of child users. 

If the assessment identifies child users then this condition is clearly satisfied. Even if no child users are identified at a given point in time, aspects of a service that are likely to appeal to children could mean the condition is still met. 
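The two-stage scoping exercise described above can be sketched as simple decision logic. This is a hypothetical illustration only: the function and parameter names are our own shorthand, and the Bill does not express the test in these terms.

```python
# Hypothetical model of the OSB "child user condition" scoping exercise.
# Parameter names are illustrative, not statutory language.

def child_user_condition_met(children_can_access: bool,
                             significant_child_users: bool,
                             likely_to_attract_children: bool) -> bool:
    """Return True if the child user condition would be met."""
    if not children_can_access:
        # Stage 1: if children cannot access the service (or any part
        # of it), the condition is not engaged.
        return False
    # Stage 2: the condition is met if a significant number of children
    # already use the service, OR it is likely to attract them.
    return significant_child_users or likely_to_attract_children
```

Note that a service with no identified child users today would still meet the condition where it is likely to attract a significant number of them.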

What are the child-specific duties?

There are a number of obligations specific to children for services within scope of the OSB (see here for our coverage of the obligations more generally).

In particular, there is an overarching duty to protect children's online safety. This mirrors the approach taken in the Children's Code which sets up the first of the 15 standards as an obligation to "always act in the best interests of the child". The overarching duty in the OSB is broken down into several sub-duties:

Take proportionate steps to mitigate and manage risks of harm, and the impacts of such harms, to children in different age groups

This distinguishes children in general from children in different age groups. ICO guidance on the Children's Code splits out different age groups up to the age of 17, and it seems likely that organisations will be expected to take a similar approach under the OSB. Types of harm, and the impacts of such harms, will clearly differ among children of different ages. For example, content of a sexual nature may be highly harmful to young children but less harmful to older teenagers. 

Operate a service using proportionate systems and processes

This entails:

  • preventing children of any age from encountering primary priority content that is harmful to children, and
  • protecting children in age groups judged to be at risk of particular harm from a type of content from encountering such content.

"Primary priority content" (and "priority content") means content which is designated as such in secondary legislation. This means that there will be categories of content which should be treated as de facto harmful by service providers, though what these are is not yet clear. 

There is also a qualifier of "proportionate". This means that while the OSB asks a lot of organisations within scope, it is unlikely that the OSB's enforcer, Ofcom, will expect all organisations of every type to take every step available to them to prevent exposure of children to harmful content. Instead, it asks organisations to adopt a risk-based approach.

Specify in terms of service

Service providers need to specify:

  • how children are to be prevented from encountering primary priority content that is harmful to children
  • how children in age groups judged to be at risk of priority content that is harmful to children are to be protected from encountering such content where they are not prevented from doing so, and
  • how children in age groups judged to be at risk of harm from non-designated harmful content are to be protected from encountering such content where they are not prevented from doing so.

Ensure that terms of service are clear and accessible and that they are applied consistently

Alongside the Children's Code and its requirements for clarity and transparency with children about their personal data and privacy, it appears that the OSB will also require organisations to have terms of service which are comprehensible to children if children are likely to engage with their services. The OSB seeks to empower children to report and challenge harmful content they encounter, in order to prevent others encountering it too. The action they can take should be conveyed in plain English, in a child-friendly format. 

What content is harmful to children?

This is defined as content which:

  • is designated as content harmful to children by secondary legislation (see above)
  • the provider of the service has reasonable grounds to believe carries a material risk of having, directly or indirectly, a significant adverse physical or psychological impact on a child of ordinary sensibilities, or
  • the provider of the service has reasonable grounds to believe there is a material risk of the content's dissemination having a significant adverse physical or psychological impact on a child of ordinary sensibilities, taking into account how many users may be assumed to encounter the content by means of the service and how quickly and widely it may be disseminated via the service.

As with most categorisations in the OSB, the treatment of harmful content is broad. It is unclear what is meant by a child of "ordinary sensibilities", or what "significant adverse physical or psychological impact" may look like. However, the UK government has published Interim Codes of Practice concerning CSEA and terrorist content. The examples provided demonstrate clear risks to both the physical and psychological wellbeing of children, but borderline cases may prove difficult to assess. 

What does a children's risk assessment look like?

Another key duty falling on service providers will be to carry out children's risk assessments, to keep these up to date with Ofcom's designations of risk, and to carry out further risk assessments before making significant changes to the design or operation of the service. Wrapped into this is the requirement to notify Ofcom of any non-designated content harmful to children which is identified during such an assessment.

The children's risk assessment must identify, assess and understand:

  • the service's user base, including numbers of children in different age groups
  • the level of risk of children encountering primary priority content, priority content and non-designated content that is harmful to children, giving separate consideration to children in different age groups and taking into account algorithms
  • the level of risk of functionality of the service facilitating the presence or dissemination of content harmful to children
  • the different ways the service is used and the impacts on level of risk of harm to children
  • the nature and severity of harm that might be suffered by children in different age groups
  • how the design and operation of the service may reduce or increase risks identified.

It's clear that there will be no 'one size fits all' approach to this. The assessment should separately consider different risks to different age groups, how children interact with the service, and the types and severity of harm. 

It is advisable for organisations to begin thinking about risks to children now. Assessment exercises are likely to be resource-intensive, especially for organisations which explicitly target children, or which know that, whether targeting or not, significant numbers of children use their services. 

In particular, the OSB requires the assessment to consider how any algorithms may affect the harmful content encountered by children. The algorithms of large technology companies are often their greatest assets; the OSB will ask these companies to stop and consider how their algorithms impact children. 

The DCMS "one stop shop" for businesses on child online safety

Data protection and privacy

This guidance largely refers out to general data protection obligations under law and the Children's Code and the ancillary information provided by the ICO. It sets out the following summary:

  • a children's data protection impact assessment should be carried out
  • settings must be "high privacy" by default
  • only the minimum amount of personal data should be collected and retained
  • children's data should not be shared unless you can demonstrate a compelling reason to do so, and
  • geolocation services should be switched off by default.

Age-appropriate content

This guidance sets out how businesses should design their services to ensure content is suitable for children using the services:

  • decide what content is acceptable and tell users about this
  • be clear on minimum age limits and take proactive steps to understand if those under the age limit are using the services
  • offer easy-to-use reporting mechanisms for children
  • consider special protections for accounts opened by children
  • plan and regularly update how you will manage inappropriate content on your site
  • consider implementing safety technology tools
  • consider advertising as content on your site and know how it is regulated, and
  • consider providing age ratings on your content.

There is additional guidance for services used by under-13s. 

Positive user interactions and harmful conduct

This guidance sets out how businesses should establish a safe environment for children to interact with each other, including by direct messaging, voice or video calls, commenting, reacting, liking, tagging and playing games. It sets out the following summary:

  • create rules in child-friendly language setting out acceptable behaviours
  • consider special protections for accounts opened by children
  • enable users to block other users, limit information shared and control their interactions with others
  • regularly review moderation, reporting and takedown processes to ensure they are effective in tackling abusive content
  • ensure your reporting system is accessible and easy
  • establish a clear process for tackling abuse, and
  • if you operate a social media platform, adhere to the four principles of the government's voluntary Social Media Code of Practice.

Protect children from online CSEA

In line with the Interim Code of Practice on CSEA, the government is particularly concerned about online CSEA. Businesses are advised to report all forms of CSEA as soon as possible to their local police force.

So, what should you do next?

There is an irony that guidance and regulatory requirements which emphasise the importance of clarity and accessibility for child users are themselves complicated and challenging to assess. In most areas of regulation, it is clear who is subject to the regulator's oversight; that is not the case in the context of online harms. Many organisations may be apprehensive about preparing for the OSB and the Children's Code, but even if compliance is likely to take a long time, it is much better to have a clear understanding of the level of exposure and the work needed to manage it.

Organisations should first assess the applicability of the OSB and the Children's Code to their operations. Once that is done, they need to put a plan in place to ensure they can achieve compliance and mitigate any unavoidable risks. If, following an assessment, you are still unsure whether your organisation falls within the scope of the OSB and the Children's Code, it would be wise to err on the side of caution and assume that it does – the regulators are likely to assert their authority as widely as possible, and it is better to be prepared to meet this than to seek to avoid it.

Find out more

To discuss the issues raised in this article in more detail, please reach out to a member of our Technology, Media & Communications team.

In this series

Technology, Media & Communications

Online Safety Bill – are you caught?

Briefing

by Louise Popple

Technology, Media & Communications

Online Safety Bill – illegal and harmful content and safety duties

In-depth analysis

by Xuyang Zhu

Technology, Media & Communications

Risk assessments under the Online Safety Bill

Quick read

by Mark Owen

Technology, Media & Communications

The regulation of child safety online – an update

In-depth analysis

by Jo Joyce, Alex Walton
