Author

Debbie Heywood

Senior Counsel – Knowledge


24 May 2021

Radar - May 2021 – 1 of 3 Insights

Out of harm's way? Online Safety Bill published

The government has published the draft Online Safety Bill but can it deliver on what was promised?

What's the issue?

Tackling online harms has become a priority around the world but this involves dealing with a wide range of issues arising not only from illegal activity, but also from activity which is lawful but harmful.  What exactly is harmful content and how do you tackle it without infringing on free speech?

What's the development?

Having set out its approach to legislation in this area in December 2020 as we discussed here, the UK government has now published its long-awaited Online Safety Bill.  As expected, it introduces a statutory duty of care on certain online providers to protect their users from harm.  The regime will be overseen by Ofcom, which will have a range of enforcement powers, including the power to fine businesses up to the greater of £18 million or 10% of annual global turnover.

What does this mean for you?

If you are a provider of:

  • services which allow users to upload and share user generated content (UGC) – user-to-user services; or
  • search services,

as defined in the draft legislation, you can now start to prepare.

Duties will be determined by the nature of the content and activity on the service, in terms of whether it is illegal, harmful to children, or legal but still harmful to adults.

Businesses will be classed into categories according to the number of users of a service, its functionalities, and the risk of harmful content spreading.  The largest social media companies are likely to fall into Category 1, with most other businesses in scope being classed as Category 2.  Category 2A will be assessed by reference to a regulated search service's number of UK users, and Category 2B will depend on a user-to-user service's number of UK users and its functionalities.  The conditions for categorisation are listed in Annex 4.

While the draft legislation does provide more detail than was previously available, it will be supplemented by Codes of Practice to be developed by Ofcom, as well as secondary legislation which will further stipulate what constitutes harmful content.  The definitions in the Bill of harmful content, whether to children or adults, remain vague and will not be particularly helpful to service providers trying to prepare for the impact of the legislation.

There is considerable emphasis on protecting children (under-18s) online, with different obligations for providers of user-to-user services "likely to be accessed by children".  This is the same language used in the ICO's Age Appropriate Design Code (or Children's Code) - it covers not only services targeted at children, but those likely to be accessed by them, and requires a similar risk-assessment exercise.

In fact, a risk-based approach is prevalent throughout the Bill, alongside transparency, reporting and record-keeping requirements.  This places a considerable compliance burden on regulated services but also has the benefit of allowing some flexibility to account for different business models and levels of risk.  

The debate will continue as to whether the Bill sufficiently protects freedom of speech and expression, and whether it is sensible to place decision making about what content is harmful and how to mitigate risk largely in the hands of the service providers themselves. 

Given the difficulties in determining when content is harmful and to whom, even with yet to be published supplementary Codes of Practice and secondary legislation, it remains to be seen whether the legislation is workable, and how much of an impact it will have, positive or otherwise.  

This is a significant (and, at 145 pages, lengthy) piece of legislation, so here are some of the highlights.  Please also join us at our webinar on 25 May when we'll be discussing the impact of the Bill in more detail.

Read more

What services are caught by the Bill?

The Bill covers:

  • services which allow users to upload and share user generated content (UGC) – user-to-user services; or
  • search services.

A range of exempted services is set out in Schedule 1, including:

  • internal business functions (like message boards)
  • services where the only UGC is email, SMS or MMS, or one-to-one aural communications
  • limited functionality services, for example, comments, reviews, 'likes', emojis, 'yes and no' voting
  • public bodies.

Journalistic content, including on news publisher websites, is explicitly protected under the 'freedom of speech' provisions, which also cover personal rights to freedom of expression and privacy, and content of democratic importance.

What is the territorial reach?

The Bill will apply to the whole of the UK and to services based outside the UK where users in the UK are affected.  The duties of care only apply to the design and operation of the service in the UK and to users in the UK.

What is illegal content?

Illegal content is defined as any regulated content in relation to a user-to-user service which amounts to a relevant offence, or any content in relation to a regulated search service that amounts to a relevant offence.  

Regulated content is defined as UGC, subject to exemptions for:

  • emails, MMS, SMS and one-to-one live aural communications
  • comments and reviews on provider content
  • paid-for adverts
  • news publisher content.

What is UGC?

UGC, in relation to a user-to-user service, is defined as content:

  • generated by a user of the service or uploaded to or shared on the service by a user of the service, and
  • that may be encountered by another user or users of the service by means of the service.

This includes content generated by means of software, bots or other automated tools.

Most of the elements mentioned here are further explained or defined in the Bill.

What is content harmful to children?

This is defined in section 45 as content which is regulated content (UGC, subject to exceptions) and which is designated as content harmful to children by secondary legislation, or which:

  • the provider of the service has reasonable grounds to believe carries a material risk of having, directly or indirectly, a significant adverse physical or psychological impact on a child of ordinary sensibilities; or
  • the provider of the service has reasonable grounds to believe there is a material risk of the content's dissemination having a significant adverse physical or psychological impact on a child of ordinary sensibilities, taking into account how many users may be assumed to encounter the content by means of the service and how quickly and widely it may be disseminated via the service.

There are exceptions and further elements to the definition.

What is content harmful to adults?

As set out in section 46, the criteria for assessment are similar to those for assessing when content is harmful to children but by reference to an adult of ordinary sensibilities.

Duty of care for all regulated user-to-user services

All regulated user-to-user services will have the following duties:

  • to carry out and maintain illegal content risk assessments
  • to take steps to mitigate and manage risks of harm caused by illegal content
  • to protect freedom of expression and privacy
  • to provide a reporting and redress mechanism for users
  • to keep records evidencing the carrying out of these duties.

Additional duties for regulated user-to-user services likely to be accessed by children

  • to carry out children's risk assessments relating to harmful content
  • to protect children's online safety
  • additional reporting and redress duties in relation to content likely to be harmful to children.

Additional duties for Category 1 services

Category 1 services have additional duties including:

  • to carry out adult risk assessments in relation to harmful content
  • to protect adults' online safety
  • to protect users' rights to freedom of expression and privacy
  • to protect journalistic content and content of democratic importance
  • additional reporting and redress duties in relation to content harmful to adults, and protection of journalistic content and content of democratic importance.

Duty of care for all regulated search service providers

All regulated search services have the following duties:

  • to carry out and maintain illegal content risk assessments
  • all the duties relating to illegal content
  • to protect freedom of expression and privacy
  • reporting and redress duties
  • record-keeping and review duties.

Where a search service is likely to be accessed by children, the provider must also comply with the additional duties relating to content harmful to children.

How to meet the duty of care

Businesses must put in place systems and processes to assess and mitigate risk to individuals and to improve user safety in relation to the different types of content.  Part of this will involve assessing whether or not a service is likely to be accessed by children.  

As with the GDPR, compliance is an ongoing process and documenting compliance via risk assessments, in addition to the various record-keeping obligations, is a central pillar of the Bill.

Regulated businesses will be required to set up mechanisms allowing users to report harmful content and to appeal against takedown of content.  Category 1 and 2 services will also be required to publish transparency reports setting out the measures they have taken to tackle online harms.

Ofcom, fees, enforcement and Codes of Practice

Ofcom will be required to establish a register of services meeting Category 1 and 2 thresholds.

Companies above a threshold based on global annual revenue will have to notify Ofcom and pay an annual fee. The threshold is likely to be high enough to mean this will only apply to a small number of businesses.

Ofcom is required to produce Codes of Practice on terrorist and child sexual exploitation and abuse (CSEA) content, and on aspects of compliance with relevant duties.

As regulator, Ofcom has a range of enforcement powers, including issuing 'use of technology' warnings and notices requiring use of particular technology to assist with compliance, business disruption measures, and, ultimately, significant fines.  Senior managers may be criminally liable for failure to comply with information requests.

Next steps

The Bill now begins its path to enactment, at which point a range of measures will come into force immediately, while others will be brought in by secondary legislation.

The legislation is sure to be the subject of debate but given the size of the government's majority in Parliament, its progress is likely to be relatively smooth, although it is unclear how long it will take to achieve Royal Assent.

In this series

Technology, Media & Communications

Out of harm's way? Online Safety Bill published

24 May 2021

by Debbie Heywood

Technology, Media & Communications

EU moves to ban high-risk AI

24 May 2021

by Debbie Heywood

Technology, Media & Communications

How should the law treat cryptoassets and other digital assets?

24 May 2021

by Debbie Heywood
