
18 January 2021

EU and UK Digital policy – 6 of 6 Insights

UK government sets out final approach to regulating online harms

A summary of the issues, developments, and what it means for you.

Author

Debbie Heywood

Senior Counsel – Knowledge


What's the issue?

In February 2020, the government published its Initial Consultation Response to the Online Harms White Paper, which was published in April 2019. The final response had been expected in the spring but was delayed.

What's the development?

The DCMS published the final response to its consultation on the Online Harms White Paper in December 2020, together with interim codes of practice on terrorism and child sexual exploitation and abuse (CSEA). Central to the proposals is an Online Safety Bill (OSB), set to be published later this year, which will introduce a statutory duty of care on businesses within scope to protect their users. Ofcom will be appointed as regulator.

What does this mean for you?

The final response provides more clarity on who will be caught by the OSB and the sorts of harms it will tackle. While there is more detail on how harms will be classified, the problem of clearly defining what constitutes content which is lawful but harmful remains. We will need to wait for publication of the legislation to know whether it is resolved, but it seems likely that the approach taken will be risk-based, leaving service providers with much of the decision-making burden.


What will be in scope?

The OSB will apply to:

  • services which host user-generated content (UGC) that can be accessed by users in the UK, and/or which facilitate public or private online interaction between service users, one or more of whom is in the UK
  • search engines

regardless of where the relevant company is based.

This will include social media services, consumer cloud storage sites, video sharing platforms, online forums, dating services, online instant messaging services, peer-to-peer services, video games which enable interaction with other users online and online marketplaces. Only companies with direct control over the content and activity on a service will be subject to a duty of care. Both public communication channels and services where users expect a greater degree of privacy (eg online instant messaging services and closed media groups) will be covered.

What is not in scope?

  • Business to business services.
  • Businesses which play a functional role in enabling online activity, eg ISPs, VPNs, browsers, web-hosting companies, device manufacturers, app stores and security software; however, they will have a duty to cooperate with Ofcom on business disruption measures.
  • Low-risk businesses, including services used internally by businesses and those with limited functionality, for example retailers who offer product and service reviews. User comments on digital content will not be covered, provided they relate to content directly published by a service; this includes reviews and comments on products and services directly delivered by a company, as well as 'below the line' comments on articles and blogs. By way of example, a lower-risk service might include features such as the ability to moderate all content, public messaging forums with text content only, and steps to ensure an age-appropriate environment for children, for example by restricting contact with children by unknown users.
  • Content published by a news publisher on its own site and user comments on that content. There will also be protections for journalistic content published on in-scope services.
  • Services managed by educational institutions that are already subject to regulatory or inspection frameworks (or similar processes) that address online harm.
  • Email and telephony providers.

What is meant by UGC and user interaction?

  • UGC means digital content (including text, images and audio) produced, promoted, generated or shared by users of an online service, where such content may be paid-for or free, time-limited or permanent. It must have the potential to be accessed, viewed, consumed or shared by people other than the original producer, promoter, generator or creator.
  • The definition of UGC includes organic and influencer ads which appear on in-scope services, including images or text posted from user accounts to promote a product, service or brand, whether or not these are paid for.
  • User interaction means any public or private online interaction between service users with potential to create and promote UGC, where interaction may be one-to-one or one-to-many and may involve means other than text, images and audio.
  • A user refers to any individual, business or organisation (private or public) that puts content on a third-party online service. Users may be members, subscribers, sellers, customers or visitors depending on the nature of the service and may generate content or interact directly or through an intermediary like an automated tool or bot.

What type of harm will be covered?

  • Content will be harmful where it gives rise to a reasonably foreseeable risk of a significant adverse physical or psychological impact on individuals.
  • There will be a limited number of priority categories of harmful content to be set out in secondary legislation. These are:
    • criminal offences (including child sexual exploitation and abuse, terrorism, hate crime and sale of illegal drugs and weapons)
    • harmful content and activity affecting children, such as pornography or violent content
    • harmful content and activity that is legal when accessed by adults, but which may be harmful to them, such as abuse and content about eating disorders, self-harm or suicide.
  • Some categories will be explicitly excluded to avoid regulatory duplication. They include harm resulting from:
    • breaches of intellectual property rights
    • breaches of data protection legislation
    • fraud
    • consumer protection law
    • cybersecurity breaches or hacking.

      It is currently unclear whether product safety will be in scope.
  • The OSB may take into account any final recommendations by the Law Commission on communications offences including cyber-flashing and 'pile-on' harassment.
  • The OSB will not cover harm occurring through the dark web.
  • Harms to organisations will not be in scope of the regime. This would suggest that defamation or malicious falsehood against corporates/organisations will also be out of scope. 

Duty of care

Companies in scope will have a duty of care to their users to prevent UGC or activity on their services from causing significant physical or psychological harm to individuals. They will be required to prevent the proliferation of illegal content and activity online, and to ensure that children using their services are not exposed to harmful content. To meet the duty of care, companies in scope must understand the risk of harm and put in place appropriate systems and processes to improve user safety, in accordance with the following guiding principles (which will also be applied by Ofcom when deciding on enforcement action).

  • Improving user safety – taking a risk-based approach that considers harm to individuals.
  • Protecting children – requiring higher levels of protection for services used by children.
  • Transparency and accountability – increasing user awareness about incidence of and response to harms.
  • Pro innovation – supporting innovation and reducing the burden on business.
  • Proportionality – acting in proportion to the severity of harm and resources available.
  • Protection of users' rights online – including freedom of expression and right to privacy.
  • Systems and processes – taking a systems and processes approach rather than focusing on individual pieces of content. This will include user tools, content moderation and recommendation procedures, and safety by design.

Businesses within scope should complete an assessment of the risks associated with their services and take reasonable steps to reduce the risks of the harms they have identified occurring. The steps taken will depend on the risk and severity of harm occurring, the number, age and profile of users, and the company's size. In addition, all companies in scope will have a specific legal duty to have effective and accessible reporting and redress mechanisms.

A tiered approach

Expectations of companies will differ in relation to different categories of content and activity, according to whether it is: illegal; harmful to children; or legal when accessed by adults but nonetheless harmful to them (which includes disinformation and misinformation which could cause significant harm). Businesses will be classed as either category 1 or category 2, with the majority falling into category 2.

All businesses will be required to take proportionate steps to address illegal content and activity and to protect children from harmful content.

General steps required will depend on the risk and severity of harm occurring, the number, age and profile of users, and the company's size. Search engines will need to assess the risk of harm occurring across their entire service. Businesses will also have to provide user reporting mechanisms and an appeal procedure against content take down.

Only category 1 businesses will be required to take steps in relation to content or activity which is legal but harmful to adults. Category 1 businesses are likely to include Facebook, Twitter, TikTok and Instagram. They will be required to:

  • Take proactive steps to address illegal content.
  • Provide extra protections for children.
  • Self-assess risks of lawful but harmful content.
  • Submit transparency reports to Ofcom on how they are addressing online harms.

Content and activity which is legal but harmful to children

All companies in scope will be required to assess the likelihood of children accessing their services. Only services likely to be accessed by children will be required to provide additional protections for children using them. This will be in line with the ICO's Children's Code.

Additional protections will include carrying out a child safety risk assessment and identifying and implementing proportionate mitigations including age-appropriate protective measures.

Enforcement

Ofcom will be the regulator. Its costs are expected to be met by firms within scope which are above an as yet unspecified threshold based on global annual revenue, although the government says most businesses will not be required to pay a fee.

Ofcom will establish a super complaints mechanism (for representative organisations) and an advocacy mechanism. The government will establish a statutory appeals procedure.

Ofcom will be expected to take a proportionate, risk-based approach, but will have a range of enforcement powers, including:

  • Fines of up to £18 million or 10% of global annual turnover, whichever is higher, for failing to meet the duty of care (see the illustration after this list).
  • Business disruption measures against any in-scope business providing services to UK users including directions to withdraw services or, for egregious failures, blocking services from being accessed in the UK.
  • Power to issue directions and notices of non-compliance.
  • The OSB will not initially provide for criminal sanctions against senior management but the government may introduce these at a later stage if necessary.
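
To give a sense of how the 'whichever is higher' maximum would work in practice (these are hypothetical figures for illustration, not examples given in the government's response): a company with a global annual turnover of £500 million could face a fine of up to £50 million, as 10% of turnover exceeds the £18 million floor, whereas for a company with a turnover of £100 million, 10% would come to only £10 million, so the £18 million figure would set the maximum instead.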

Codes of practice

Codes of practice will be introduced setting out in more detail the systems and processes companies need to introduce to fulfil their statutory duty of care. Companies will be required to comply with the codes or demonstrate that any alternative approach they take is equally effective. The government has published interim codes on terrorism and CSEA, although these are currently voluntary and non-binding.
