20 November 2023
Radar – November 2023 – 3/3 Insights
The UK's Online Safety Act (OSA) received Royal Assent on 26 October 2023. Ofcom's powers took effect immediately, but most of the remaining provisions will be brought into force two months later. Much of the detail around compliance under the OSA will be provided by codes of practice and guidance, and Ofcom has now published its first OSA consultation, on protecting people from illegal harms online.
The OSA focuses on user-generated content (subject to limited exceptions) and applies to user-to-user services and search services as well as pornographic content services. The OSA regulates illegal content and certain specified types of harmful content, focusing especially on content harmful to children on services likely to be accessed by them. Terrorism and Child Sexual Exploitation and Abuse (CSEA) content are a particular focus, but a range of harmful content is also covered in specified circumstances. In relation to the most harmful type of content likely to be accessed by children, age verification/estimation must be used (subject to a limited exception).
The OSA applies to services which have links to the UK. Various safety duties apply to different categories of content. To establish what they need to do, services must carry out a variety of risk assessments against Ofcom risk profiles. Service providers also have transparency requirements and obligations to provide redress, and are likely to have to amend their terms and conditions. There are wider duties to protect freedom of expression and the right to privacy, including personal data.
Category 1 services (to be determined, but likely to be the larger social media services) and Category 2A and 2B services have additional duties. In particular, Category 1 services have expanded duties to protect fundamental freedoms, including content of democratic importance and news publisher content. They also have to comply with adult user empowerment provisions which require them to give adult users options to prevent them encountering certain types of harmful content.
Ofcom is the regulator under the OSA, with extensive powers and duties. It is responsible for producing initial risk profiles and a raft of codes of practice and guidance which will inform how service providers are expected to comply with the OSA, as well as a range of reports on the impact and operation of the OSA. The process of introducing these (as set out in Ofcom's revised approach to implementing the OSA) is likely to take at least three years, with everything subject to consultation and much of it dependent on the introduction of secondary legislation. Ultimately, Ofcom will have a range of enforcement powers, including the ability to fine organisations up to the higher of £18m or 10% of global annual turnover. The OSA is very wide-ranging and Ofcom estimates that around 100,000 online services could be in scope.
Ofcom published its consultation on protecting people from illegal harms online on 9 November 2023. This is the first of four major consultations planned by Ofcom over the next 18 months under the OSA.
The consultation focuses on:
As part of this, Ofcom has also published its first draft codes of practice and associated guidance, which will form the bedrock of compliance with the illegal harms regime. These cover:
Helpfully, Ofcom has provided a 15-page at-a-glance summary, consisting mostly of a table setting out which obligations apply to different types of organisation classified by size and risk level, together with a summary of each chapter; this provides the best way in. Proposed measures are broken down according to which are likely to apply to services based on their size and their level of risk for illegal harms, classified by Ofcom as low risk, specific risk or multi-risk. Ofcom proposes defining a service as large where it has an average user base of more than 7 million per month in the UK (roughly 10% of the UK's population). Less helpfully, these classifications do not correlate to those for Category 1, 2A and 2B service providers, which will be set out later under secondary legislation. Organisations will need to refer to the Service Risk Assessment Guidance (Volume 3 and Annex 5) to help them identify which risk level each relevant service falls into.
Volume 2 of the consultation sets out the causes and impacts of online harm. This is Ofcom's register of risks, which forms the basis for Ofcom's risk profiles (set out in Appendix A of Annex 5), and services are expected to refer to it when carrying out their own risk assessments relating to illegal harms. Ofcom highlights certain service types as playing a particularly prominent role in the spread of priority illegal content. Notably, it says that file-storage and file-sharing services and adult services pose a particularly high risk of disseminating Child Sexual Abuse Material (CSAM), and that social media services play a role in the spread of an especially broad range of illegal harms.
Certain functionalities are also identified as posing particular risks, notably:
Ofcom has categorised priority illegal content into 15 categories:
A table setting out the individual offences covered by the OSA (including 130 priority offences), and the classification of the type of harm to which each relates, is included in Annex B to the Service Risk Assessment Guidance (Annex 5).
As set out previously by Ofcom, it is proposing a four-step process for illegal content risk assessments:
A range of governance, risk mitigation and content moderation measures is recommended, again differing according to the size of the organisation, the nature of the risk and the type of service (user-to-user or search). Recommended measures to help combat priority illegal harms in scope of the OSA include:
Ofcom suggests relevant services take targeted steps to combat CSEA, fraud and terrorism including:
The more onerous measures will be targeted only at services which are large and/or high risk. Guidance around measures to combat violence against women and girls online will not be published until early 2025 as Ofcom recognises it needs to do more work in that area.
The draft Illegal Content Judgements Guidance helps services identify whether or not a piece of content is likely to be illegal. The guidance on the overall approach runs to 52 pages, but the detail is set out in Annex 10, which runs to 390 pages.
Ofcom anticipates around 100,000 services will need to consider the Online Safety Act. While many of the safety measures will only apply to the larger and higher-risk services, all in-scope services, even the smaller, lower-risk ones, will have a range of obligations, including risk assessment and mitigation, reporting and complaints, and record-keeping. Most businesses will also need to make changes to their terms of service. Ofcom has published an overview and quick guides for online services setting out ‘what you need to know’ to help them understand the first steps. Those impacted by the OSA will need to look at the consultation and associated documents in full, which is no easy task as they run to over one thousand pages. Questions posed in the consultation are couched in broad terms, and organisations which wish to respond need to do so by 5pm on 23 February 2024.
Understanding whether and to what extent you are in scope, assessing risk to users, and putting in place mitigation measures and processes to comply with the safety duties will be crucial, not least because of Ofcom's extensive enforcement powers.
Once the consultation on illegal harms closes, Ofcom will consider the responses, review its proposals and publish a statement setting out its final decisions together with final versions of the above guidance and codes of practice. The statement is expected in Winter 2024 and services will then have three months to conduct their risk assessments. The codes of practice will be subject to Parliamentary approval and are expected to come into force by the end of 2024. As such, in-scope service providers have some time to prepare for compliance around illegal harms, but now that they have more (albeit draft) detail, they can begin a more thorough assessment of what they will need to do.
The next phase of consultations will focus on child safety, pornographic content and protecting women and girls, and is expected in December 2023.
It's worth remembering that the UK is not the only country attempting to regulate the online environment. The EU's Digital Services Act (DSA), for example, covers similar but different ground. Businesses impacted by both sets of legislation will face a particularly complex set of compliance challenges.
For more on what is covered by the OSA, see our Interface edition. We'll also be holding a webinar on 5 December when we'll look at the impact of the OSA on affected businesses in light of the Ofcom consultation, as well as in the context of the EU's Digital Services Act. Register your interest here.
By Emma Allen