12 February 2020
Government prioritises freedom of speech and protection of children in its initial consultation response to the Online Harms White Paper.
The government published its Online Harms White Paper in April 2019. The White Paper covered plans to tackle a wide range of online harms from illegal to potentially harmful content and behaviour. Its main proposal was the introduction of a statutory duty of care to make companies responsible for the safety of their users and for tackling harmful content. Compliance was to be overseen and enforced by an independent regulator who would be given powers to issue fines, find senior management liable, and in some circumstances, to require site blocking.
The White Paper received a mixed response and free speech campaigners were quick to criticise it as overly restrictive and impossible to implement. You can read our views here.
The government has published its Initial Consultation Response to the White Paper. While there is little new, there has been a shift in emphasis, particularly in relation to the protection of freedom of speech, which is now at the forefront of the proposals, and a more noticeable emphasis on the protection of children. Rather than defining the scope of user-generated content (UGC) to be covered, the proposals place the responsibility squarely on businesses to develop their own approaches to potentially harmful (as opposed to illegal) online content. Whether or not private communications (for example, in closed community groups) will be covered remains unclear.
The government does not appear to be looking at introducing separate codes of practice for each type of online harm. It is, however, proposing a nuanced approach to transparency and reporting obligations depending on the size and nature of the business. More urgent attention will be given to tackling online terrorist and Child Sexual Exploitation and Abuse (CSEA) content and activity, with interim voluntary codes of practice to be published in the coming months.
Legislation on online harms will apply only to companies that provide services or use functionality on their websites which facilitates the sharing of user-generated content or user interactions (eg comments, forums or video sharing). To be in scope, a business will have to operate its own website with the functionality to enable sharing of UGC or user interactions. Business-to-business services will not be covered.
Ofcom is proposed as the regulator although the scope of its enforcement powers is yet to be decided.
If you are caught by the new regulation you are unlikely to be forced to remove specific pieces of legal content but you can expect more extensive and specific obligations around tackling illegal content. This will need to be removed expeditiously and you will need to minimise the risk of it appearing by having effective systems.
You will most likely be required to develop and implement clear, transparent and easily accessible policies around what content and behaviour will be acceptable on your sites and to enforce policies consistently and transparently. Effective transparency reporting will be required to help ensure your content removal is well-founded and preserves freedom of expression.
Particularly robust action will be required to tackle terrorist and CSEA content. You will also need to ensure a higher level of protection for children and take reasonable steps to protect them from inappropriate or harmful content. You will be expected to use a proportionate range of tools, including age assurance and age verification technologies, to prevent children from accessing age-inappropriate content and to protect them from other harms. In addition, you will be required to have effective and proportionate user redress mechanisms, as the regulator will not investigate or adjudicate on individual complaints.
Expectations will be differentiated between illegal content and activity on the one hand and potentially harmful activity on the other. Transparency reporting requirements will also vary depending on the type of business you are, the type of service being provided and the risk factors involved.
A final report on the consultation will be published in spring 2020, so legislation is some way off and requirements may yet change, but two interim and voluntary codes of practice on tackling online terrorist content and CSEA content will be published shortly and are intended to bridge the gap until the new regulator becomes operational. The government will also be publishing its first annual transparency report in the next few months and a media literacy strategy in summer 2020.
Even if legislation is not imminent, "the government expects companies to take action now to tackle harmful content or activity on their services".
by Debbie Heywood and Daniel Hirschfield