Author

Debbie Heywood

Senior Counsel – Knowledge


27 March 2023

Radar - March 2023 – 2 of 4 Insights

Ofcom begins outlining approach to risk assessments under the Online Safety Bill

What's the issue?

The Online Safety Bill (OSB) is currently at Committee Stage in the House of Lords and heading towards its final stages.  Under the OSB, all regulated firms will be required to do a risk assessment of illegal content that may appear on their service.  Services likely to be accessed by children will also have to assess the risk of content which is harmful to children. 

Ofcom's role with regard to risk assessments is to provide guidance on carrying them out, including by explaining what type of content needs to be covered, how harmful content might appear on a service, and what constitutes good risk management practice as part of service design, organisational culture and strong governance. 

What's the development?

Ofcom has set out its planned approach to risk assessments under the Online Safety Bill.

Ofcom's proposed approach to risk across the online safety regime is framed to ensure that:

  • risk assessments are an integral part of broader risk management processes and embedded within an organisation's existing risk management structures
  • responsibilities for risk management are clearly specified and owned at the most senior levels
  • risk management activities are regularly reported to senior decision-makers and independently scrutinised, where possible
  • risk controls are assessed for effectiveness, and emerging risks are monitored.

Ofcom says its guidance will cover the kinds of evidence to be considered in risk assessments and what is likely to meet the requirement that assessments are "suitable and sufficient" for different types of organisation – larger services are likely to have a higher bar to meet in this respect.  To that end, Ofcom plans to outline an additional set of evidence inputs for services which need to consider a range of sources of evidence to inform their risk assessments.

While recognising there is no 'one size fits all' approach, Ofcom says a good risk assessment should help a service anticipate and address the ways in which its users could be exposed to greater risks of harmful content.  Services should ask questions like:

  • How does the service's user base affect this risk; for example, do large numbers of child users in the UK increase the risk of exploitation?
  • How do the functionalities of the service affect risk; for example, does offering stranger pairing increase the risk of romance fraud?
  • What effect does the service's business model have; for example, how can a service's financial incentives under a given revenue model increase the risk of hosting harmful content?

Ofcom has developed a four-step process which can be applied by services of all types and sizes:

  • Step one: establish the context – identify the risks of harm that need to be assessed, consult the risk profiles produced by Ofcom which set out its assessment of key risk factors, and identify any gaps in understanding and evidence.
  • Step two: assess the risks – review evidence about the relevant platform and associated risks.  Assess the likelihood of harmful content appearing and the severity/impact of harm.  In addition, evaluate existing mitigating measures.
  • Step three: decide measures and implement – decide how to comply with safety duties, including through Ofcom's Codes of Practice.  Identify which measures need to be implemented, implement them, and record the outcomes of the risk assessment.
  • Step four: review and update – report via relevant governance structures.  Monitor the effectiveness of mitigation measures.  Put in place regular review periods for assessments, recognising any triggers which might require an assessment to be revisited before the next scheduled review.

Going forward, Ofcom says it is working with service providers and regulatory counterparts to help improve the coherence of risk assessments across different regimes, notably the EU's Digital Services Act. 

Ofcom plans to launch consultations on its risk assessment guidance on illegal content and on children's risk assessments as soon as its powers under what will become the Online Safety Act have commenced.  Ofcom also plans to publish a sector-wide register of risks, assessing the risks of harm presented by illegal content on user-to-user and search services, as well as risk profiles which will set out the key risk factors services should take account of when they conduct their assessments.  It will also produce Illegal Content Judgments Guidance to explain the offences covered by the OSB and help services make judgments about whether content is illegal. 

What does this mean for you?

Much of Ofcom's approach to risk assessments has been informed by its role under other principles-based legislation, as well as by a wide-ranging literature review.  It says it has learned from a review of best practice and industry standards that good risk management is not a single process but a broader approach by companies which puts risk-awareness at the forefront of decision making – a culture of risk-awareness and prioritisation across all teams in an organisation.  Video-sharing platform (VSP) providers should note Ofcom's recommendation that they complete risk assessments even though they are not required to do so under the current VSP Regulations.  Ofcom also refers to the importance of internationally recognised risk governance standards (eg ISO 31000 and the Three Lines Model) in embedding a risk-focused culture as a fundamental part of an organisation's governance and leadership.

Ofcom cannot begin consulting on more detailed risk assessment guidance until the OSB becomes law, but this document sets out a framework for its priorities and the direction it is likely to take.  Affected service providers can begin taking steps to set up reporting and review processes now, even if the detail of what will be required is not yet clear.

Once Ofcom's guidance has been finalised, relevant services will have three months to complete their first illegal content risk assessments.

