Predictions 2021 – Opinion 5 of 7

Online harms: increased responsibility for harmful user generated content in 2021

What to expect from the UK's plans to regulate online harms.

Briefing

Adam Rendle
Alex Walton

We expect 2021 to see a very significant development in the UK's attempts to reduce online harms. This is due to the introduction of legislation imposing a "duty of care" on businesses to take more responsibility for the safety of their users in relation to user-generated content (UGC).

It's very hard to predict exactly how this is going to work, given the complexities involved. So instead, let's look at what to expect, some of the issues likely to arise, and how best to prepare.

The legislation is likely to cover a wide range of content, including terrorist content, child sexual exploitation and abuse (CSEA) material, cyberbullying, and online disinformation. While social media platforms are most obviously in the crosshairs, the duties will apply to any company which allows users to share or discover UGC or interact with each other online.

Grappling with frameworks

Warnings that attempts to regulate the internet would restrict freedom of expression online and facilitate censorship have not prevented the introduction of this sort of legislation. The direction of travel at both UK and EU level is towards greater regulation of the internet in most respects.

For example, the European Commission's proposal for a regulation on preventing the dissemination of terrorist content online includes one-hour takedown requirements on hosting service providers, proactive filtering measures and a duty of care. Meanwhile, the Copyright (Digital Single Market) Directive removes safe harbour protections for online content-sharing service providers and requires them to take additional steps to benefit from a defence to copyright infringements on their services.

A wide net

The UK government has taken a leading position internationally through its Online Harms White Paper published in April 2019 (OHWP) and subsequent Online Harms White Paper Consultation Response in February 2020 (Consultation Response). 

The government intends to regulate online harms by imposing a wide and relatively unspecific statutory duty of care which will be enforced by an independent regulator (presumed to be Ofcom). 

This duty will apply to businesses that provide services or functionality on their platforms which facilitates the sharing of UGC or user interaction. The duty isn't restricted to social media platforms and video-streaming services; it also covers websites with comments sections, forums and video-sharing functions. Clearly, the intended scope is wide.

What constitutes "online harms" is similarly unspecific. The OHWP definition is: 

"…online content or activity that harms individual users, particularly children, or threatens our way of life in the UK, either by undermining national security, or by reducing trust and undermining our shared rights, responsibilities and opportunities to foster integration." 

In attempting to address concerns over its breadth, the OHWP separates online harms into two categories: illegal content and activities, and content which is legal but harmful. The examples provided suggest that the first category would cover terrorism or CSEA, whereas the second would cover the promotion of self-harm and suicide.

The proposed regulatory framework includes:

  • A statutory duty of care on businesses within scope to take greater responsibility for UGC on their websites.
  • Codes of practice as developed by the regulator.
  • A specific focus on areas with the most evidence of harm, particularly around children and other vulnerable users.
  • The publication of a Media Literacy Strategy to raise awareness among online users on issues such as privacy and disinformation.
  • The submission of transparency reports to the regulator.

To address concerns around freedom of expression, the OHWP notes that legislation will regulate processes and process management, rather than introducing takedown duties around specific pieces of legal UGC. The OHWP seemingly advocates for a risk-based approach to tackle key risks without prescribing for every eventuality. Several examples are provided:

  • Clear, public-facing terms and conditions which set out acceptable UGC.
  • Consistent enforcement of such terms and a particular focus on protecting children.
  • Robust and transparent content complaints and appeals mechanisms.
  • Expeditious removal of illegal content and systems to prevent its reappearance.
  • A proportionate range of tools to protect children, and age verification mechanisms.
  • No requirement that adults be prevented from accessing or posting lawful content.

Practical application

Some of these requirements may be practically applied in the following ways:

  • Background processes such as algorithms which make recommendations should not skew results towards extreme or unreliable material and should instead promote a broad range of news sources. 
  • Users should be provided with a suite of tools such as resources on disinformation and where to seek help (eg in respect of self-harm and suicide), permitting blocking and muting, and robust reporting and appeals mechanisms.
  • The prevention of "phoenix accounts" (ie where blocked users are able to reappear). 
  • Providing robust measures for illegal content, such as rapid take-downs and proactive monitoring. 

What are the concerns around the proposals?

Illegal vs legal but harmful content

The proposed list of harms in the OHWP is broad but it remains unclear how regulation will differ between illegal and legal but harmful content. It seems sensible to assume that there will be stricter requirements where the former is concerned, but how strict isn't clear yet. 

While the OHWP provides an indicative list of which types of content would fall into each category, it seems impractical for a regulator to enforce by reference to a list. The OHWP itself notes that harms will vary as society changes so an inflexible enforcement regime may not always be appropriate. 

Types of UGC

There's no proposed definition of UGC but its scope is wide, encompassing comments, forums and reviews. There is a spectrum of types of content from the most established content producers (eg Hollywood studios) to the most amateur (eg an individual uploading their first ever video) and many degrees of 'professionalism' in between.

Common between them is that they are not (typically) produced by or on behalf of the service which allows their sharing. If that's what makes the content UGC, then services will owe these duties in relation to all the content shared on them and it would cover not only YouTube but also Netflix. 

If there is to be some differentiation across this spectrum, then services will owe the new duties only in relation to UGC and not in relation to the rest of their content. That would mean a service owes more duties in relation to legal but harmful UGC than it does in relation to its own content which is legal but harmful.

Geographic scope

This will be UK-specific regulation which will theoretically apply across borders to services available in multiple territories, but which will be difficult to implement in practice.

Take online video games with text and voice chat functions: these would fall within scope of the proposals, but how would publishers apply the duty of care when a British gamer could be playing against, for example, a German player for whom there are no such obligations?

Should publishers remove functionality which may infringe the legislation from a game's UK release? If they didn't, and the regulator enforced against them for the infringing element, how would the regulator's powers operate in practice?

There are many questions to be answered on the scope of powers. 

The harmful content vs the remainder of the content

Many types of UGC and services (in which UGC sits) will be nuanced and multi-faceted. For instance, a blog could be published on a service which contains content which isn't classed as UGC. The service would be in scope because of its hosting of the blog but it's not clear whether the duty of care would extend to the entirety of the service.

Collateral damage

There are a number of industries and types of businesses which aren't obviously within the crosshairs, but which are nonetheless captured by the scope of the proposed legislation. 

As noted above, video games are conceptually captured. The proposed requirement to expeditiously remove illegal content may be practically problematic in an environment where the illegal content is contained in text chat in the middle of an online game. 

Small online retailers are also captured. Requirements around empowering users and providing them with a suite of tools to manage their experience online are unlikely to be economical for a specialist Italian food retailer which permits its customers to leave comments on the particular type of pasta they purchased. The application of a proportionate regulatory approach will be key here.

Cloud services providers may also be captured by the current planned scope. There are numerous and compelling reasons why such service providers ring-fence types of data they hold and implement extensive systems to limit access, but that won't necessarily relieve them from compliance. 


Enforcement

The OHWP envisages various types of enforcement action. The regulator is likely to have a range of typical powers, such as serving public notices and imposing fines. However, more dramatic measures such as ISP blocking, disruption of business activities and senior management liability are also under discussion, which could have significant impacts on day-to-day operations.

What are the next steps?

Developments are anticipated imminently. In summary:

  • The codes of practice on terrorism and CSEA are expected by the end of 2020 or in early 2021.
  • The final government response to the OHWP consultation is also expected before the end of 2020. 
  • A draft of the legislation is due to follow in spring 2021.

Businesses should begin reviewing their user terms and policies, systems and procedures now, to identify what compliance gaps they may have relative to key elements of the proposals. The requirements around transparency and take-down appeal procedures are, we expect, areas where particular attention will be required. Indeed, the government has specifically said that it expects action now.

Here to help

We're already advising on the implications of and compliance with the new proposals. If you have any questions about the issues raised in this article, please contact us.
