What's the issue?
As we discussed here, progress of the Online Safety Bill, the UK's draft legislation to regulate user-generated content online, stalled at report stage after its second reading when Boris Johnson stepped down as Prime Minister in July 2022. In the meantime, the EU passed its Digital Services Act, which looks to regulate similar, although not identical, issues.
What's the development?
The Online Safety Bill resumed its progress in December 2022, after a five-month delay, and the government proposed some significant amendments. Owing to the stage the Bill had already reached, the government returned a limited number of clauses to the Public Bill Committee for line-by-line scrutiny. The clauses went back to the whole House for its third Report stage on 17 January 2023, and the Bill now moves on to the Lords, where it is expected to face a large number of amendments. The plan is for the Bill to pass in this parliamentary session (ie by Spring 2023).
Changes in the republished Bill include:
- removing references to specific types of lawful but harmful content which social media platforms would have had to consider taking down once defined by the Secretary of State. This is replaced by a "triple shield": a legal requirement on social media platforms to remove illegal content, take down anything which breaches their own terms of service, and give adults greater choice over the content they see and engage with, including through tools such as human moderation, blocking of flagged content, and sensitivity and warning screens
- new accountability and transparency requirements relating to social media policies and procedures, including a requirement to publish more information about potential risks to children and age verification measures
- requiring platforms to publish details of any Ofcom enforcement action against them
- adding further commissioners with whom Ofcom will be required to consult when producing codes of conduct.
The Bill also contains new offences including:
- a criminal offence of controlling or coercive behaviour which will be added to the list of priority offences considered to constitute illegal content
- criminalisation of the encouragement of self harm
- new communications offences.
The draft harmful communications offence has been removed, as the government considered it a risk to free speech. In addition, there are new duties on major online platforms prohibiting them from removing or restricting user-generated content, or suspending or banning users, where this does not breach their terms of service or the law.
The government has also published a guide to the Online Safety Bill. It covers how the OSB will protect children and adults, the types of content that will be caught, and enforcement.
What does this mean for you?
The government has said it is not abandoning the idea of regulating lawful but harmful content; it is simply placing the responsibility for doing so on the platforms which host the content, by requiring them to enforce their terms of service. Ofcom is also in the frame: for those categories of content for which there are duties, Ofcom is required to produce guidance with examples of the type of content it considers to be covered.
The approach the government is now taking is arguably simpler in terms of understanding what is required under the OSB. It also appears to re-set the balance with regard to protecting free speech. However, it is unclear how effective it will be. Notably, different organisations have different policies about what kind of UGC is acceptable – we have only to look at the direction Twitter has taken since Elon Musk's takeover to understand that attitudes (and terms of use) can vary significantly between platforms.
Additionally, campaigners argue that the revised OSB lacks teeth and will fail to protect those most vulnerable to harmful online content. Measures now tend to focus on remediation rather than prevention, and on users setting their own preferences to help filter out unwanted content – not necessarily an effective strategy for protecting the vulnerable. Since the revised Bill was published, a group of Tory rebel MPs took up the issue and forced a change to the Bill which will make senior managers at social media firms (and potentially other tech firms) criminally liable for serious and persistent breaches of their duty of care to children. This will not criminalise executives who have "acted in good faith to comply in a proportionate way".
The Labour party has said it will seek to amend the Bill to bring it more in line with the previous version, ie to restore the 'lawful but harmful' content provisions to the face of the legislation. If those efforts are unsuccessful, Labour has hinted at changing the law again should it win the next general election.
The Bill is expected to pass in the first quarter of the year. But, as we have said before, much of the detail will be set out by the Secretary of State and Ofcom, perhaps even more so than under the previous version of the Bill.