A key weakness of the 2019 Online Harms White Paper – the precursor to the Online Safety Bill (OSB) – was the lack of concrete protection for the right of freedom of expression. This right, as enshrined in Article 10 of the European Convention on Human Rights, is "one of the essential foundations of a democratic society and one of the basic conditions for its progress and for each individual's self-fulfilment" (Lingens v Austria).
The OSB's focus on preventing harm results in systems, measures, policies and practices which are designed to identify and prevent harmful information from being published and to facilitate the prompt removal of harmful information which has already been published.
This potentially clashes with the right of freedom of expression. By monitoring and taking action against user information and users, the service provider will also potentially be interfering with users' right to privacy.
The OSB has specific clauses covering freedom of expression and privacy and, for Category 1 services, content of democratic importance and journalistic content. Are they enough to protect these fundamental rights and freedoms?
There is a duty on all user-to-user and search services, when deciding on and implementing safety measures and policies, to have regard to the importance of protecting users' right to freedom of expression within the law, and of protecting users from a breach of any statutory provision or rule of law concerning privacy.
This is only a duty to have regard to these fundamental rights. Therefore, if a service provider can show that it has considered them at the appropriate time, it will probably have complied in this regard. Good record keeping (or at least a tick box) when making decisions which may affect users' free speech or privacy may help to demonstrate that the duty has been fulfilled.
The OSB does not explain what "freedom of expression within the law" means. The explanatory notes merely say that this includes the common law; the relevant law must be English law. Service providers will need to know what that law is in order to be able to have regard to it.
There is a substantial body of case law in England (and of the European Court of Human Rights, which the English courts must take into account under the Human Rights Act 1998) about the nature and value of freedom of expression.
Furthermore, Article 10 ECHR includes the freedom to hold, impart and receive information, opinions and ideas. Therefore, it is not only the potential right of the poster to post their content which comes into play, but also the rights of the community of users as a whole to receive the content, which forms part of this fundamental right.
As for the duty to have regard to the importance of protecting users from a breach of privacy (whether statutory or otherwise), the purpose is likely to be to protect users and their content from, for example, being spied on and interfered with by the service providers and from action being taken against them. Since privacy includes laws on the processing of personal data, it will cover the rights and remedies of users under data protection law (eg under the UK GDPR and Data Protection Act 2018).
The right of privacy under Article 8 ECHR includes "correspondence" and can potentially include content posted online, depending on such things as whether the poster has a reasonable expectation of privacy.
Of course, some user-to-user communications will be deliberately public, but others might be private communications to one person or to a small group. In any event, User Generated Content (UGC) will also fall within the UK GDPR and Data Protection Act 2018 as it is likely to be personal data. To be able to have regard to privacy, service providers will therefore need to understand the relevant laws, including the tort of misuse of private information and data protection law.
The OSB also includes an obligation on Ofcom to state in its annual report the steps it has taken and the processes it has operated to ensure its online safety functions have been exercised in a manner compatible with Articles 8 and 10 ECHR.
Category 1 services
There are additional duties on Category 1 service providers:
Content of democratic importance
Category 1 service providers have additional duties to protect political speech, in particular "content of democratic importance" (CDI). CDI is content that:

- is news publisher content or regulated user-generated content, and
- is or appears to be specifically intended to contribute to democratic political debate in the UK or a part or area of the UK.
The key question for the second condition is whether the content appears to be intended to contribute to democratic political debate.
It is bound to cover debate about party politics and promoting or opposing political parties and national or local government policies. But how wide does it go? And what about more extreme views? Would a controversial remark calling for immigration from certain countries to be banned contribute to democratic political debate or be so antithetical to democratic values that it does not even fall within the definition (notwithstanding the broad scope of the right of freedom of expression)? If the former, Category 1 services have certain duties (see below).
To be considered CDI, the content appears to need to relate to debate taking place in the UK, but this could cover debate about non-UK politics (eg US politics); however, the wording is ambiguous.
If the second condition is satisfied, then most content will be caught since the definition of "regulated user-generated content" is broad. This refers to any UGC which is not exempt (like user comments and reviews on content produced and published by the service provider).
News publisher content is either content generated directly on the service by a recognised news publisher, or user content reproducing such content in full (eg a user reproducing in full a news publisher's article or recording).
Category 1 service providers have the following duties regarding CDI:

- to operate systems and processes designed to ensure that the importance of the free expression of CDI is taken into account when making decisions on whether to take down content or take action against users, and
- to apply those systems and processes in the same way to a diversity of political opinion.
It may not be straightforward for service providers to ensure that their systems and decision making apply in the same way to a diversity of political opinion. For example, should they treat online discussions in favour of arguably xenophobic/nationalistic political parties in the same way as mainstream centrist parties who are anti-discrimination?
Category 1 service providers also have a duty to ensure their terms of service are applied consistently in relation to content which they "reasonably consider" is of democratic importance.
Service providers will probably need a list of principles to help ensure they treat diverse parties in the same way by reference to those principles. They will also need to be careful to be consistent in what they allow or take down.
Category 1 service providers have additional duties related to journalistic content. This is defined as content which is:

- news publisher content or regulated user-generated content
- generated for the purposes of journalism, and
- UK-linked.
What is content generated for the purposes of journalism? Does it only mean "news-related material", which is defined as part of the definition of a "recognised news publisher"? That applies to material consisting of news, opinion or information about current affairs, and gossip about celebrities, other public figures or other persons in the news.
Journalistic content is likely to be wider than that, as journalism generally encompasses more than news, current affairs and gossip about public figures. It is not clear whether, or the extent to which, it includes citizen journalism (posts by individuals who are not professional journalists making information available to the public about current events – eg when they happen to be present when an earthquake, terrorist attack or riot takes place, or when they are just providing information or comment about current affairs). The UK ICO's draft Code on Journalism and data protection says that the more something resembles the activities traditionally carried out by the mainstream media or other clear sources of journalism, the more likely it is to be journalism. The same is likely to be the case in the context of online safety. This issue may be expanded on in the Codes of Practice and/or by court cases.
The duties include ensuring that the importance of the free expression of journalistic content is taken into account when making decisions on whether to take down content or take action against users. As with CDI, good record keeping before and at the decision-making stage may help demonstrate that the obligation has been fulfilled.
Codes of Practice
The Codes of Practice – which Ofcom must produce and issue with recommended steps for compliance with the relevant duties – will need to address the duties concerning freedom of expression, privacy, CDI and journalistic content. In the course of preparing the Codes, Ofcom must consult with various persons who represent different interests and/or have certain expertise. This includes persons whom Ofcom considers have relevant expertise in equality issues and human rights, including the right to freedom of expression under Article 10 ECHR and privacy under Article 8 ECHR.
Are fundamental rights and freedoms sufficiently protected?
The Bill ensures that freedom of expression and privacy and, for Category 1 services, CDI and journalistic content, are given attention. Whether this is enough to protect these fundamental rights may depend on the Codes of Practice and ultimately on the service providers in question.
The key test will be whether these rights are sufficiently protected at the decision-making stage in relation to potential action against allegedly illegal and allegedly legal but harmful content, as well as against the users who posted it.
The obligations to "have regard to" these rights or to ensure they are "taken into account" do not seem onerous. Weighed against the duties to protect children and adults from harm, it is possible that freedom of expression and user privacy will take second place, at least in borderline cases. Service providers will need a good understanding of these fundamental rights to uphold them, especially in the face of offensive but legal content.
It is not clear why CDI and journalistic content are only given protection within Category 1 services. Perhaps it was thought that smaller platforms do not have the resources to cope with these additional duties or the power and influence to make a big difference. However, genuine political speech and journalism should arguably be adequately protected across the board, not only on the bigger platforms. Having said that, as these types of speech are manifestations of the right of freedom of expression – to which the duties apply across all services – the distinctions may be academic.
To discuss the issues raised in this article in more detail, please reach out to a member of our Technology, Media & Communications team.