18 January 2024
Radar - January 2024
Secondary legislation brings large parts of the Online Safety Act into force, while the EU's DSA is due to come into full effect in mid-February.
The UK's Online Safety Act became law on 26 October 2023, when some of its provisions came into force, but large parts of it were left to be brought in by secondary legislation. Sections 114(2) and (7) came into force on 21 November 2023 under the Online Safety Act 2023 (Commencement No.1) Regulations 2023.
Meanwhile, the EU's Digital Services Act, which covers similar but different ground (as we discuss here), has applied for the most part to very large online platforms (VLOPs) and very large online search engines (VLOSEs) since August 2023, and will apply to other companies from 17 February 2024.
The Online Safety Act 2023 (Commencement No.2) Regulations 2023 bring into force all remaining provisions of the OSA from 10 January 2024, except for ss101 and 102 (relating to information in connection with investigations into the death of a child), which will come into force under these Regulations on 1 April 2024, and the following provisions, in respect of which commencement provisions have not yet been made:
section 210 (Repeal of Part 4B of the Communications Act).
The Digital Services Act is the EU legislation aimed at tackling illegal online content. It covers similar but different ground to the OSA (as we discuss here), but many organisations will be caught by both.
Over the course of 2023, there were a number of developments to help businesses with DSA compliance.
The European Commission published non-binding Q&A guidance at the end of January to help VLOPs and VLOSEs publish their average monthly active service recipients on their sites, as required for the purposes of their designation under the DSA. A full list of VLOPs and VLOSEs is maintained by the Commission here.
The European Commission adopted an implementing Regulation in June on detailed arrangements for the conduct of certain proceedings by the Commission – namely investigation and enforcement activities relating to VLOPs and VLOSEs – under Article 83 of the DSA.
A DSA Transparency database was launched by the Commission at the end of September. The Commission also announced the opening of a European Centre for Algorithmic Transparency (ECAT), which will assess algorithms used by VLOPs and VLOSEs under the DSA. This will include reviewing transparency reports and risk self-assessments submitted by the relevant companies to check whether they comply with DSA requirements, and carrying out inspections of the systems using the algorithms where required to do so by the Commission.
The EC adopted a Delegated Regulation in October, with rules on independent audits to assess compliance of VLOPs and VLOSEs with the DSA. It sets out the steps designated services must take to verify the capabilities and independence of their auditors, and the main principles auditors should apply when performing DSA audits. The first audits will be published in August 2024.
Amazon and German fashion company Zalando are challenging their designations before the EU General Court amidst a lack of clarity under the DSA. On 27 September 2023, the General Court delivered an interim decision in the legal dispute between Amazon and the European Commission, agreeing to pause the implementation of DSA obligations related to creating and disclosing an advertising repository and offering users a non-profiling-based option for each recommendation system until the Court decides whether Amazon is a VLOP.
Meanwhile, German consumer protection organisations are reportedly already considering bringing class actions with respect to non-compliance with the DSA. Questions have also been raised about the Commission's readiness – it has to sign a number of cooperation agreements by 17 February 2024, when compliance by smaller companies will begin.
The first information requests under the DSA were sent out by the Commission in October 2023. The EC sent X (formerly Twitter) a request for information to assess its compliance, particularly with regard to its policies and actions concerning notices on illegal content, complaint handling, risk assessment and risk mitigation measures. X was required to provide the requested information on the activation and functioning of its crisis response protocol by 18 October, and information relating to the other questions by 30 October. The European Commission announced on 18 December 2023 that it was opening formal proceedings to assess whether X has breached its obligations under the DSA around:
dark patterns.
The Commission is now gathering evidence but there is no deadline by which it needs to conclude its investigation.
The EC has also requested information on a variety of issues from TikTok, Snap and Meta.
In December, the EC launched its Digital Services Terms and Conditions database, consisting of 790 sets of terms and conditions from more than 400 services provided by over 290 distinct service providers, including Apple, Meta and Microsoft. The entries include a variety of documents such as Commercial Terms, Developer Terms, Live Policy, Terms of Service and Privacy Policy, offering a comprehensive view of the digital legal landscape. The Commission says that this is just the beginning: through crowdsourcing with verified users, these numbers will continue to rise, ensuring an ever-expanding resource.
The European Commission launched a consultation on 8 December 2023 on the templates for content moderation activity reports. The draft implementing Regulation includes a quantitative and a qualitative template and instructions for completing them. Information must be broken down by calendar month. The reporting period is the calendar year (although the first period will only begin on 17 February 2024, when the DSA applies in full), with VLOSEs and VLOPs having to report every six months. The templates do not have to be used until 30 June 2024. The consultation closes on 24 January 2024.
See here for more about the DSA.
To some extent, the entry into force of large parts of the OSA is a technical development because much of the application of the OSA will be based around Ofcom's final guidance and codes of practice, and there has not yet been any designation of categorised services.
On 11 January, Ofcom published an information note, New rules for online services: what you need to know, which suggests that, in addition to responding to Ofcom consultations and to Ofcom information requests (where received), the only immediate compliance point is for providers of user-to-user services under s72(1). They should consider whether they need to make changes to their terms of service to comply with s72(1), which requires that terms of service inform users in clear, accessible language of their right to bring a breach of contract claim if:
The user is suspended or banned from using the service in breach of the terms of service.
Ofcom says other changes to terms of service may be needed when more of the new rules to protect users from illegal harms online come into force in late 2024. This is the only direct compliance action highlighted by Ofcom as potentially required now under the OSA.
The first phase of guidance, focusing on illegal harms duties, was published in draft for consultation on 9 November 2023, with more to come on child safety and protecting women and girls in spring this year. It will be a three-year process before the regime is fully operational, although that does not mean businesses caught by the OSA can put off preparing for compliance. See here for more on what to expect from Ofcom and the compliance timeline, and here for more on the OSA generally.
Meanwhile, those caught by the DSA are subject to a more advanced regime although, as outlined above, there are areas which lack clarity and there are doubts about the Commission's ability to do everything it needs to do by 17 February this year.
This makes for an uncertain environment for businesses caught by one or both of these pieces of legislation.
by Debbie Heywood and Daniel Hirschfield