23 May 2019
Product Protection – 12 of 20 Insights
"Digital technology is overwhelmingly a force for good across the world and we must always champion innovation and change for the better. At the same time, I have been clear that we have to address the Wild West elements of the Internet through legislation, in a way that supports innovation." 1
Matt Hancock, then Secretary of State for Digital, Culture, Media and Sport, made this statement in May 2018 after a government consultation on online safety.
The Online Harms White Paper was published on 8 April 2019.2 It responds to growing public concern about online material, fuelled by, among other things, the broadcasting of terror attacks, the use of the internet for abuse and bullying, and increased awareness of the internet's potentially damaging impact on children.
The White Paper promises that the government will establish a new statutory duty of care "to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services." This duty of care will be enforced by an independent regulator with a "suite of powers" to take enforcement action against companies that have breached their new statutory duty.
While the government's aims are laudable, on closer inspection the White Paper is light on detail. Many key concepts are yet to be defined, and many important decisions have either been postponed until after further consultation or simply delegated to the regulator. The following are just some of the questions that require answers.
Rather than offering a definition of "harmful" content, the White Paper3 offers a non-exhaustive list of harms which are intended to be caught by the duty of care.
These are separated into three categories: "harms with a clear definition" (including terrorist content and child sexual exploitation), "harms with a less clear definition" (including cyberbullying and trolling) and "underage exposure to legal content" (including children accessing pornography and inappropriate material).
The White Paper explains4 that "the regulatory approach will impose more specific and stringent requirements for those harms which are clearly illegal, than for those harms which may be legal but harmful, depending on the context." It appears from this that companies will be expected to distinguish between these categories.
This will not be an easy task. Defining "illegal" content by reference to the existing criminal law would be a complex exercise, because some criminal offences are poorly suited to targeting online criminal behaviour.
A Scoping Report undertaken by the Law Commission as part of its review of the law around abusive and offensive online communications highlighted the need for "reform and consolidation of the communication offences, so that they are clearer and more proportionate."5
In relation to identifying harms which are not illegal but are still to be considered "harmful", the White Paper offers little useful guidance. Worryingly, it refers on a number of occasions to content which "threatens our way of life in the UK", by no means an uncontroversial standard.
Precisely what companies must do to discharge the new duty of care has been left to be determined by the regulator. Codes of practice will "outline the systems, procedures, technologies and investment, including in staffing, training and support of human moderators, that companies need to adopt to help demonstrate that they have fulfilled their duty of care to their users."6
These requirements will not be the same for all companies; instead, the regulator will "be required to assess the action of companies according to their size and resources, and the age of their users."7
Whilst the White Paper promises that the regulator will be able to issue fines and demand additional information from companies, sanctions for severe breaches are the subject of further consultation. Suggestions range from ISP blocking and personal liability for individual senior managers to forcing companies to withdraw their services (a measure reserved for extremely serious breaches).
Regulating online content necessarily involves balancing the competing aims of protecting internet users and defending freedom of speech. The White Paper prioritises the former.
A single paragraph headed "Protecting users' rights online"8 promises that the regulator will "ensure that the new regulatory requirements do not lead to a disproportionately risk averse response from companies that unduly limits freedom of expression, including by limiting participation in public debate."
Precisely how the regulator will control companies' reactions is not spelt out. Given that sanctions may include fines tied to the volume of views of illegal content, companies are likely to be overzealous in categorising and removing "harmful" material.
Finally, if users wish to challenge the removal of their posts, what recourse will they have? The White Paper points to judicial review of the regulator by companies and others, and seeks views on whether there should be a route of appeal through a tribunal other than the High Court.
The regime proposed by the Online Harms White Paper combines an extremely broad, poorly defined scope of the duty of care with draconian sanctions for non-compliant companies.
The dangers of such an approach are exemplified by Germany's Network Enforcement Act (known as "NetzDG"). NetzDG requires platforms to take down "manifestly unlawful" content within 24 hours of a notification, and other unlawful content within seven days. The scope of content caught ranges from "hate speech" to mere "insult".
The result of this broad definition is that tweets from a right-wing politician, and from a left-wing satirical magazine which made fun of that politician, have both been removed. There is widespread recognition that NetzDG has not struck the right balance, and it is to be hoped that the UK government will refine its plans following further consultation.
3 Online Harms White Paper Section 2, pages 30-40
4 Online Harms White Paper paragraph 3.5
6 Online Harms White Paper paragraph 3.6
7 Online Harms White Paper paragraph 3.4
8 Online Harms White Paper paragraph 5.12
by Multiple authors