30 July 2021
Online Safety Bill – 5 of 6 Insights
A key weakness of the 2019 Online Harms White Paper – the precursor to the Online Safety Bill (OSB) – was the lack of concrete protection for the right to freedom of expression. This right, as enshrined in Article 10 of the European Convention on Human Rights, is "one of the essential foundations of a democratic society and one of the basic conditions for its progress and for each individual's self-fulfilment" (Lingens v Austria).
The OSB's focus on preventing harm results in systems, policies and practices which are designed to identify and prevent harmful information from being published and to facilitate the prompt removal of harmful information which has already been published.
This potentially clashes with the right to freedom of expression. By monitoring users' content and taking action against that content and the users who post it, the service provider will also potentially be interfering with users' right to privacy.
The OSB contains specific clauses addressing these rights: duties concerning freedom of expression and privacy and, for Category 1 services, duties concerning content of democratic importance and journalistic content. Are they enough to protect these fundamental rights and freedoms?
There is a duty on all user-to-user and search services, when deciding on and implementing safety policies and procedures, to have regard to the importance of protecting: (1) users' right to freedom of expression within the law; and (2) users from unwarranted infringements of privacy.
This is only a duty to have regard to these fundamental rights. Therefore, if a service provider can show that it has considered them at the appropriate time, it will probably have complied in this regard. Good record keeping (or at least a tick box) when making decisions which may affect users' free speech or privacy may help to demonstrate that the duty has been fulfilled.
The OSB does not explain what freedom of expression "within the law" means. The explanatory notes merely say that this includes the common law, and that the relevant law is English law. Service providers will need to know what that law is in order to have regard to it.
There is a substantial body of case law in England (and of the ECtHR, which the English courts must take into account under the Human Rights Act 1998) on the nature and value of freedom of expression.
Furthermore, Article 10 includes the freedom to hold opinions and to receive and impart information and ideas. Therefore, it is not only the right of the poster to post their content which comes into play, but also the right of the community of users as a whole to receive that content, which forms part of this fundamental right.
As for the duty to have regard to protecting users from unwarranted infringements of privacy, its purpose is likely to be to protect users and their content from being monitored and interfered with by service providers, and from action being taken against them.
The right of privacy under Article 8 includes "correspondence" and can potentially include content posted online, depending on such things as whether the poster has a reasonable expectation of privacy.
Of course, some user-to-user communications will be deliberately public, but others may be private communications to one person or to a small group of friends or family. The User Generated Content (UGC) will also fall within the Data Protection Act 2018 as it is likely to be personal data. To be able to have regard to privacy, service providers will therefore need to understand the relevant laws, including the tort of misuse of private information and the Data Protection Act 2018.
There are additional duties on Category 1 service providers.
Category 1 service providers have additional duties to protect political speech, in particular "content of democratic importance" (CDI). CDI is content that: (1) is news publisher content or regulated content; and (2) is or appears to be specifically intended to contribute to democratic political debate in the UK or a part or area of the UK.
The key question for the second condition is whether the content appears to be intended to contribute to democratic political debate. It is bound to cover debate about party politics and promoting or opposing political parties and national or local government policies.
But how wide does it go? And what about more extreme views? Would a controversial remark calling for immigration from certain countries to be banned contribute to democratic political debate or be so antithetical to democratic values that it does not even fall within the definition (notwithstanding the broad scope of the right of freedom of expression)? If the former, Category 1 services have certain duties (see below).
To be considered CDI, the debate appears to need to take place in the UK, although it could cover debate within the UK about non-UK politics (eg US politics); the wording is ambiguous.
If the second condition is satisfied, then most content will be caught, since the definition of "regulated content" is broad: it covers any UGC which is not exempt (exemptions include user comments and reviews on content produced and published by the service provider itself).
News publisher content is either content directly generated on the service by a recognised news service, or user content reproducing this content in full (eg a user reproducing in full a news publisher's article or recording).
Category 1 service providers have the following duties regarding CDI: (1) to operate systems and processes designed to ensure that the importance of the free expression of CDI is taken into account when making decisions about whether to take down content or take action against users; and (2) to apply those systems and processes in the same way to a diversity of political opinion.
It may not be straightforward for service providers to ensure that their systems and decision-making apply in the same way to a diversity of political opinion. For example, should they treat online discussions in favour of arguably xenophobic or nationalistic political parties in the same way as those in favour of mainstream centrist parties which are anti-discrimination?
They will probably need a list of principles so that they can treat diverse parties in the same way by reference to the principles, rather than to the parties themselves. They will also need to be careful to be consistent in what they allow or take down.
Category 1 service providers have additional duties related to journalistic content. This is defined as content which: (1) is news publisher content or regulated content; (2) is generated for the purposes of journalism; and (3) is UK-linked.
What is content generated for the purposes of journalism? Does it only mean "news-related material", which is defined as part of the definition of a "recognised news publisher"? That term covers material consisting of news or information about current affairs, opinion about matters relating to news or current affairs, and gossip about celebrities, other public figures or other persons in the news.
Journalistic content is likely to be wider than that, as journalism generally encompasses more than news, current affairs and gossip about public figures. It is not clear whether it includes citizen journalism (posts by individuals who are not professional journalists making information available to the public about current events – eg when they happen to be present when an earthquake, terrorist attack or riot takes place). This will be one of the many questions which may be expanded in the Codes of Practice and/or by court cases.
The duties are: (1) to operate systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about whether to take down content or take action against users; (2) to make available a dedicated and expedited complaints procedure for users whose journalistic content is taken down or restricted; and (3) to swiftly reinstate such content where a complaint is upheld.
As with CDI, the obligation regarding journalistic content, when making decisions on whether to take down content or take action against users, is to ensure that service providers properly consider the importance of the free expression of journalistic content. Again, good record keeping before and at the decision-making stage may help demonstrate that the obligation has been fulfilled.
The Codes of Practice – which Ofcom must produce with recommended steps for compliance with the relevant duties – will need to address the duties concerning freedom of expression, privacy, CDI and journalistic content. Before preparing the Codes, Ofcom must consult with various persons who represent different interests and/or have certain expertise. This includes persons whom Ofcom considers to have relevant expertise in equality issues and human rights, including the right to freedom of expression under Article 10 ECHR and privacy under Article 8 ECHR.
The Bill ensures that freedom of expression and privacy and, for Category 1 services, CDI and journalistic content, are given attention. Whether this is enough to protect these fundamental rights may depend on the Codes of Practice and ultimately on the service providers in question.
The key test will be whether these rights are sufficiently protected at the decision-making stage where allegedly illegal and allegedly legal but harmful content, as well as users who post it, are concerned.
The obligations to "have regard to" these rights or ensure they are "taken into account" do not seem onerous. Weighed against the duties to protect children and adults from harm, it is possible that freedom of expression and user privacy will take second place, at least in borderline cases. Service providers will need a good understanding of these fundamental rights to uphold them, especially in the face of offensive but legal content.
It is not clear why CDI and journalistic content are only given protection within Category 1 services. Perhaps it was thought that smaller platforms do not have the resources to cope with these additional duties. However, genuine political speech and journalism should be adequately protected across the board, not only on the bigger platforms. Having said that, as these types of speech are manifestations of the right of freedom of expression – to which the duties apply across all services – the distinctions may be academic.
To discuss the issues raised in this article in more detail, please reach out to a member of our Technology, Media & Communications team.
by Xuyang Zhu