A key weakness of the 2019 Online Harms White Paper – the precursor to the Online Safety Bill (OSB) – was the lack of concrete protection for the right of freedom of expression. This right, as enshrined in Article 10 of the European Convention on Human Rights, is "one of the essential foundations of a democratic society and one of the basic conditions for its progress and for each individual's self-fulfilment" (Lingens v Austria).
The focus on protecting users from illegal and some types of harmful user-generated content (UGC) will result in systems, measures, policies and practices designed to identify and prevent certain types of UGC from being published, and to facilitate its prompt removal if it has already been published. This potentially clashes with the right of freedom of expression. By monitoring and taking action against user information and users, the service provider will also potentially be interfering with users' rights of privacy. As a result of concerns that the original OSB did not do enough to protect fundamental rights and freedoms, protections for freedom of expression and the right to privacy, as well as for related content, were enhanced over the course of the Bill's passage.
What is now the Online Safety Act (OSA) following Royal Assent has specific provisions covering:
- freedom of expression and privacy
- content of democratic importance
- news publisher content
- journalistic content.
Are these provisions enough to protect these fundamental rights and freedoms?
There is a duty on all user-to-user services, when deciding on and implementing safety measures and policies, to have particular regard to the importance of protecting:
- users' right to freedom of expression within the law, and
- users from a breach of any statutory provision or rule of law concerning privacy that is relevant to the use or operation of the service.
Search services have similar obligations, which also extend to protecting the freedom of expression of "interested persons" – persons (or businesses) located in the UK and responsible for a searchable website or database.
Category 1 services have additional duties to carry out and publish up-to-date assessments of the impact their safety measures and policies may have on the rights to freedom of expression and privacy, and to specify in a public statement the positive steps taken in response to those impact assessments to protect users' rights of freedom of expression and privacy.
There is only a duty to have particular regard to these fundamental rights. Therefore, if a service provider can show that it has considered them with sufficient thought and at the appropriate time, it will probably have complied. Good record keeping when making decisions which may affect users' free speech or privacy may help to demonstrate that the duty has been fulfilled. Service providers will be treated as complying with their duties regarding freedom of expression and privacy if they take or use the relevant recommended measures to incorporate safeguards to protect users' rights.
Freedom of expression
The OSA does not explain what "freedom of expression within the law" means. The explanatory notes merely say that this includes the common law, and that the relevant law is English law. Service providers will need to know what that law is in order to have regard to it.
There is a substantial body of case law in England (and from the European Court of Human Rights, which the English courts must take into account under the Human Rights Act 1998) about the nature and value of freedom of expression.
Furthermore, Article 10 ECHR includes the freedom to hold, impart and receive information, opinions and ideas. It is therefore not only the poster's right to post their content which comes into play, but also the right of the community of users as a whole to receive it, which forms part of this fundamental right.
Privacy
The right of privacy under Article 8 ECHR includes "correspondence" and can potentially include content posted online, depending on such things as whether the poster has a reasonable expectation of privacy.
Of course, some user-to-user communications will be deliberately public, but others might be private communications to one person or to a small group. In any event, user-generated content will also fall within the UK GDPR and the Data Protection Act 2018, as it is likely to be personal data. To be able to have particular regard to privacy, service providers will therefore need to understand the relevant laws, including the tort of misuse of private information and data protection law, which is expressly mentioned in the OSA.
Content of democratic importance
Category 1 service providers have additional duties to protect political speech, in particular "content of democratic importance" (CDI). CDI is content that:
- is news publisher content (NPC) or regulated user-generated content, and
- is or appears to be specifically intended to contribute to democratic political debate in the UK or a part or area of the UK.
NPC is either content directly generated on the service by a recognised news publisher (defined in s56 OSA), or user content reproducing or linking to a full article, written item or recording published by a recognised news publisher.
The key question for the second condition is whether the content is or appears to be intended to contribute to democratic political debate.
It is bound to cover debate about party politics and promoting or opposing political parties and national or local government policies. But how wide does it go? And what about more extreme views? Would a controversial remark calling for immigration from certain countries to be banned contribute to democratic political debate or be so antithetical to democratic values that it does not even fall within the definition (notwithstanding the broad scope of the right of freedom of expression)? If the former, Category 1 services have certain duties (see below).
To qualify as CDI, the debate appears to have to take place in the UK, but it could cover debate about non-UK politics (eg US politics); the wording is ambiguous, however.
If the second condition is satisfied, then a wide range of content will be caught since the definition of "regulated user-generated content" is broad, covering any UGC which is not explicitly exempt.
Category 1 service providers have the following duties regarding CDI:
- to operate proportionate systems and processes designed to ensure that the importance of the free expression of CDI is taken into account when making decisions about how to treat such content (in particular, whether to take it down or restrict access to it) and whether to take action against the users who post it
- to ensure that those systems and processes apply in the same way to a diversity of political opinion, and
- to set out in their terms of service the policies and processes designed to achieve this, and to apply those terms consistently.
It may not be straightforward for service providers to ensure that their systems and decision making apply in the same way to a diversity of political opinion. For example, should they treat online discussions in favour of arguably xenophobic/nationalistic political parties in the same way as mainstream centrist parties who are anti-discrimination?
Service providers will probably need a set of principles to try to ensure they treat diverse parties in the same way. They will also need to be careful to be consistent in what they allow or take down.
News publisher content
Category 1 service providers have specific duties to protect NPC. For these purposes, "taking action" against content means taking down content, restricting users' access to it or adding warning labels (except warning labels normally encountered only by child users); other actions are also covered where the provider is acting on a relevant term of service. Taking action against a person includes warning, suspending or banning them from using a service, or otherwise restricting their ability to use it.
Before taking action in relation to NPC or against a recognised news publisher, service providers are required to comply with various notification and information obligations. If they fail to do so, they must act swiftly to make the required notifications and allow a reasonable period in which an application to reverse the action can be made. There are exemptions from the notification requirements where the service provider reasonably considers it would incur criminal or civil liability, where the NPC amounts to a relevant offence, or where the news publisher or the NPC in question has already been banned from the relevant service.
Journalistic content
Category 1 service providers also have duties to protect journalistic content. This is defined as content which is:
- NPC or regulated user-generated content
- generated for the purposes of journalism, and
- UK-linked.
What is content generated for the purposes of journalism? Does it only mean "news-related material", which is defined as part of the definition of a "recognised news publisher"? That covers material consisting of news or information about current affairs, opinion about such matters, and gossip about celebrities, other public figures or other persons in the news.
Journalistic content is likely to be wider than that, as journalism generally encompasses more than news, current affairs and gossip about public figures. It is not clear whether, or to what extent, it includes citizen journalism (posts by individuals who are not professional journalists making information available to the public about current events – eg when they happen to be present when an earthquake, terrorist attack or riot takes place, or simply providing information or comment about current affairs). The UK ICO's draft Code on Journalism and data protection says that the more something resembles the activities traditionally carried out by the mainstream media or other clear sources of journalism, the more likely it is to be journalism. The same is likely to be the case in the context of online safety. This issue may be clarified in Ofcom's codes of practice and/or by the courts.
The duties to protect journalistic content include:
- operating proportionate systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about whether to take down content or restrict access to it, and whether to take action against the users who post it, and
- providing a dedicated and expedited complaints procedure where journalistic content is taken down (or action is taken against its creator), with content swiftly reinstated if a complaint is upheld.
As with CDI, the obligation regarding journalistic content when making decisions on whether to take down content or take action against users is to ensure that service providers properly consider the importance of the free expression of journalistic content. Again, good record keeping before and at the decision-making stage may help demonstrate that the obligation has been fulfilled.
Ofcom's role
Ofcom is required to produce codes of practice and guidance to assist with OSA compliance. One or more codes must be produced on all duties which apply to regulated service providers. These will therefore need to address the duties concerning freedom of expression, privacy, CDI and journalistic content. In the course of preparing the codes, Ofcom must consult various persons who represent different interests and/or have certain expertise. This includes persons whom Ofcom considers to have relevant expertise in equality issues and human rights, including the right to freedom of expression under Article 10 ECHR and the right to privacy under Article 8 ECHR.
Ofcom also has a duty to produce and publish a report assessing various factors around age assurance. In doing so, it must consider service providers' need to protect users from a breach of any statutory provision or rule of law concerning privacy.
The OSA also includes an obligation on Ofcom to state in its annual report the steps it has taken and the processes it has operated to ensure its online safety functions have been exercised in a manner compatible with Articles 8 and 10 ECHR.
Are the protections enough?
The OSA ensures that freedom of expression and privacy and, for Category 1 services, CDI, NPC and journalistic content, are given attention. Whether this is enough to protect these fundamental rights may depend on the relevant codes of practice and ultimately on the service providers in question.
The key test will be whether these rights are sufficiently protected at the decision-making stage in relation to potential action against content which is allegedly illegal, or which allegedly falls within the types of legal but harmful content covered by the OSA, as well as against the users who posted it.
The obligations to "have particular regard to" these rights, or to ensure they are "taken into account", do not seem onerous. Weighed against the duties to protect children and adults from illegal content and specified harms, it is possible that freedom of expression and user privacy will take second place, at least in borderline cases. Service providers will need a good understanding of these fundamental rights if they are to uphold them.
It is not clear why CDI, NPC and journalistic content are only given protection within Category 1 services. Perhaps it was thought that smaller platforms do not have the resources to cope with these additional duties or the power and influence to make a big difference. However, genuine political speech and journalism should arguably be adequately protected across the board, not only on the bigger platforms. Having said that, as these types of speech are manifestations of the right of freedom of expression – to which the duties apply across all services – the distinctions may be academic.