In February 2020, the government published its initial consultation response to its Online Harms White Paper, which had been published in April 2019. The final response had been expected in the spring but was delayed.
The DCMS published the final response to its consultation on the Online Harms White Paper in December, together with interim codes of practice on terrorism and child sexual exploitation and abuse (CSEA). Central to the proposal is an Online Safety Bill (OSB), set to be published later this year, which will introduce a statutory duty of care on businesses within scope to protect their users. Ofcom will be appointed as regulator.
The final response provides more clarity on who will be caught by the OSB and the sorts of harms it will tackle. While there is more detail on how harms will be classified, the problem of clearly defining what constitutes content which is lawful but harmful remains. We will need to wait for publication of the legislation to know whether or not it is resolved but it seems likely that the approach taken will be risk-based, leaving the service providers with much of the decision-making burden.
What will be in scope?
The OSB will apply to companies whose services host user-generated content (UGC) or facilitate user interaction, regardless of where the relevant company is based.
This will include social media services, consumer cloud storage sites, video sharing platforms, online forums, dating services, online instant messaging services, peer-to-peer services, video games which enable interaction with other users online and online marketplaces. Only companies with direct control over the content and activity on a service will be subject to a duty of care. Both public communication channels and services where users expect a greater degree of privacy (eg online instant messaging services and closed media groups) will be covered.
What is not in scope?
What is meant by UGC and user interaction?
What type of harm will be covered?
Duty of care:
Companies in scope will have a duty of care to their users to prevent user-generated content or activity on their services causing significant physical or psychological harm to individuals. They will be required to prevent the proliferation of illegal content and activity online and to ensure that children using their services are not exposed to harmful content. To meet the duty of care, companies in scope must understand the risk of harm and put in place appropriate systems and processes to improve user safety, in accordance with guiding principles which are also to be applied by Ofcom when deciding on enforcement action.
A tiered approach
Expectations of companies will differ in relation to different categories of content and activity according to whether it is: illegal; harmful to children; or legal when accessed by adults but nonetheless harmful to them (which includes disinformation and misinformation which could cause significant harm). Businesses will be classed as either category 1 or category 2, with the majority falling in category 2.
All businesses will be required to take proportionate steps to address illegal content and activity and to protect children from harmful content.
The steps required will depend on the risk and severity of harm occurring; the number, age and profile of users; and the company's size. Search engines will need to assess the risk of harm occurring across their entire service. Businesses will also have to provide user reporting mechanisms and an appeal procedure against content takedown.
Only category 1 businesses will be required to take steps in relation to content or activity which is legal but harmful to adults. Category 1 businesses are likely to include Facebook, Twitter, TikTok and Instagram. They will be required to:
Content and activity which is legal but harmful to children
All companies in scope will be required to assess the likelihood of children accessing their services. Only services likely to be accessed by children will be required to provide additional protections for children using them. This will be in line with the ICO's Children's Code.
Additional protections will include carrying out a child safety risk assessment and identifying and implementing proportionate mitigations including age-appropriate protective measures.
Ofcom will be the regulator. Its regulatory activities are expected to be funded by fees from firms within scope whose global annual revenue exceeds an as yet unspecified threshold, although the government says most businesses will not be required to pay a fee.
Ofcom will establish a super-complaints mechanism (for representative organisations) and an advocacy mechanism. The government will establish a statutory appeals procedure.
Ofcom will be expected to take a proportionate risk-based approach but will have a range of enforcement powers to include:
Codes of practice
Codes of practice will be introduced setting out in more detail the systems and processes companies need to put in place to fulfil their statutory duty of care. Companies will be required to comply with the codes or demonstrate that any alternative approach they take is equally effective. The government has published interim codes on terrorism and CSEA, although these are currently voluntary and non-binding.