
7 March 2024

Online Safety Act - Part 2 – 6 of 7 Insights

UK Online Safety Act - Ofcom's draft guidance on identifying illegal content

Margarita Taliadoros looks at the implications of Ofcom's draft guidance on identifying illegal content regulated under the OSA.

Author

Margarita Taliadoros

Associate

On 9 November 2023, Ofcom published the first of four consultations on draft guidance and Codes of Practice on illegal harms under the new Online Safety Act (OSA). Volume 5 and Annex 10 of the consultation focus on how in-scope service providers are expected to make a judgment on whether content is illegal.

OSA test for illegal content

Illegal content is defined in the OSA as content that amounts to one of the priority offences (grouped in the consultation into 15 kinds, including terrorism offences, child sexual exploitation and abuse (CSEA) offences and the other priority offences identified in Schedules 5 to 7), or to any other offence where the victim is an individual.

Service providers caught by the Act are required to take down content where they have "reasonable grounds to infer" that content is illegal. The reasonable grounds test is satisfied where the provider:

  • has reasonable grounds to infer that the conduct/behaviour element and the mental element necessary for the commission of the offence are present or satisfied, and
  • does not have reasonable grounds to infer that a defence to the offence may be successfully relied upon.

It is worth noting that for content generated by a bot, the test will relate to the conduct or mental state of the person who may be assumed to control the bot or tool.
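Purely by way of illustration, the two limbs of the test can be thought of as a simple conjunctive check that a moderation workflow might record for each piece of content and each offence considered. The sketch below is not taken from the OSA or from Ofcom's guidance; the type and function names are hypothetical, and neither the Act nor the draft guidance prescribes how (or whether) providers encode the test in their tooling.

  # Illustrative only: a hypothetical encoding of the two-limb
  # "reasonable grounds to infer" test for one item of content and one offence.
  # Field and function names are invented for this sketch.
  from dataclasses import dataclass

  @dataclass
  class IllegalContentAssessment:
      # Limb 1: reasonable grounds to infer the conduct/behaviour element
      # and the mental element of the offence are present or satisfied.
      conduct_element_inferred: bool
      mental_element_inferred: bool
      # Limb 2: no reasonable grounds to infer that a defence may be relied upon.
      defence_inferred: bool

  def reasonable_grounds_to_infer_illegality(a: IllegalContentAssessment) -> bool:
      """Return True only where both limbs of the test are satisfied."""
      limb_one = a.conduct_element_inferred and a.mental_element_inferred
      limb_two = not a.defence_inferred
      return limb_one and limb_two

  # Example: both elements inferred and no defence apparent -> content is treated as illegal.
  print(reasonable_grounds_to_infer_illegality(IllegalContentAssessment(True, True, False)))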

When making an illegal content judgment, service providers should have sufficient understanding of UK law and consider "reasonably available information". The guidance provides more information about what this means in practice.

Ofcom guidance on illegal content judgments

Ofcom's first consultation includes a 52-page document on illegal content judgments (Volume 5), as well as a 390-page draft guidance document (Annex 10, the guidance). When finalised, the guidance will help service providers understand how they should assess whether user-generated content on their platform is illegal, and so identify when they are required to take action.

Reasonable grounds to infer

The guidance clarifies that the "reasonable grounds to infer" standard is new and lower than the "beyond reasonable doubt" standard applied by UK courts for criminal convictions. A finding by a service provider that content is illegal under this lower standard does not equate to criminal liability for the user who generated it, nor does it oblige the provider to report the content to law enforcement, unless it is CSEA content. It does, however, oblige the provider to take action against the content in question.

The nature and context of the content being judged needs to be considered in each situation, as well as the particular offence in question.

Reasonably available information

Judgments on whether content is illegal under the OSA, for example for the purposes of risk and user empowerment assessments, should be made on the basis of all relevant information reasonably available to the provider. According to the guidance, reasonably available information can include the information contained in the content itself, information obtained from third party complaints, information from the user's profile, and information about the user's activity before and after the content in question was uploaded.

The size and capacity of the service provider is a relevant factor in determining what information Ofcom deems to be "reasonably available", and providers have been split into three groups: large, small and micro services. This means that bigger, more sophisticated providers will be expected to take into account a wider range of information than just that contained in the content itself.

Another relevant factor is whether judgments are made by humans, automated systems or both. The guidance sets out more detail on what is reasonably available to a service for each offence, and adopts a "technology-agnostic approach" to reasonably available information. In practice, this means that whether an automated system or a human makes the judgment will not affect the range of information which needs to be taken into account.

What does this mean for you?

Providers caught by the OSA will need to review their existing terms and processes to assess whether they comply with the duty to take action against illegal content under the Act and, if not, what changes they need to make. The guidance should help them do that, but it is not prescriptive, so other factors may also be relevant.

Many service providers will already have terms of service and content moderation practices that go beyond what is required under the Act, for example by covering content not classified as illegal under the OSA. The OSA does not require illegal content assessments in respect of content which is prohibited in a service's terms and conditions, provided the prohibition is clearly communicated and moderation practices ensure prompt removal of prohibited content. So, for example, if a provider prohibits all sexual content in its terms of service and ensures its prompt removal, it will not have to conduct illegal content judgments in respect of the sexual offences set out in the OSA.

Service providers not falling within this exemption will have to make the necessary changes to ensure that illegal content is assessed and removed. Existing in-scope providers will need to begin complying with these duties within three months of Ofcom publishing its final statement on its illegal content Codes of Practice and final guidance – currently expected to be towards the end of 2024.

You can access Part 1 of our Interface content on the OSA here, Part 2 here, and our full range of content on the OSA and the DSA here.
