Author

Debbie Heywood

Senior Counsel – Knowledge


11 December 2023

Radar - December 2023 – 1 of 2 Insights

Radar - 2023 roundup

We bring you our annual look back at the main regulatory developments of 2023 in the UK and at EU level, focusing on:

  • AI
  • Technology
  • Online safety
  • Data and cyber security
  • Games and eGaming
  • Consumer protection and product liability
  • Advertising
  • Other legislative developments in and related to digital, tech and media.

Find out what we expect in some of these sectors during 2024 in our Predictions articles featured on our tech and media portal, Interface, and keep an eye on new and incoming legislation in these areas with our Digital Legislation Tracker.

AI

AI has arguably been one of the defining issues of 2023 as the technology goes mainstream and the world struggles with how to regulate it.  Here we cover some major UK and EU legislative and regulatory developments this year and a small selection of global initiatives.  You can read about our predictions for the year ahead here and see all our AI-related articles here.

UK

The UK government published its White Paper – 'A pro-innovation approach to AI regulation', which sets out a framework for the UK's approach to regulating AI. The government decided not to legislate to create a single function to govern the regulation of AI. It has instead elected to support existing regulators in developing a sector-focused, principles-based approach. Regulators including the ICO, the CMA, the FCA, Ofcom, the Health and Safety Executive and the Human Rights Commission will be required to consider five principles to build trust and provide clarity for innovation and will issue non-statutory guidance. Read more here.

In response to the White Paper and to other issues raised by AI, there were a number of inquiries and reports published in the UK, as well as moves by government departments to help with AI regulation during 2023, including:

  • The House of Commons Weapons System Select Committee launched an inquiry and call for evidence on the use of AI in weapons systems.
  • The CMA launched an initial review of competition and consumer protection considerations in the development and use of foundation models in May 2023. In September, it published a report setting out its views and recommending principles to guide market development. It also announced that it will publish an update in March 2024.
  • In June, following recommendations in the Vallance report, the government recognised that more work is needed to balance the rights of copyright holders with the needs of AI developers.  As a result, the IPO began work to develop a voluntary code of practice for copyright and AI.  The government will consider legislating if a code of practice is not agreed upon and widely used on a voluntary basis.
  • The House of Lords Communications and Digital Committee launched an inquiry into large language models and what needs to happen in the next one to three years to ensure the UK can respond to opportunities and risks.
  • The Department for Science, Innovation and Technology (DSIT) announced the disbandment of the AI Council and replaced it with a "wider group of expert advisers" which will input on a range of priority issues and complement the Foundation Model Taskforce established earlier in the year.
  • In July, the House of Lords published a report on AI: Developments, risks and regulation.
  • In July, the Committee of Standards in Public Life wrote to the Minister for AI and IP requesting an update on government progress against recommendations made in its 2020 report and to seek clarification as to whether the government would continue to pursue the approach outlined in its AI White Paper.
  • The House of Commons Science, Innovation and Technology Committee published an interim report in August, based on input from its inquiry into the governance of AI.  The aim of the inquiry was to assess the government's approach to regulating AI in the context of what other countries are doing.  The government published its response in November.
  • The Culture, Media and Sport Committee published a report on AI and creative technology following its inquiry, Connected tech: smart or sinister?.  This is the second report arising from the inquiry.  Earlier in August, an initial report looked at the benefits and harms of connected devices, focusing particularly on data protection and cyber security. This second report considers the creative and entertainment potential of connected technology and, in particular, the impact of AI.
  • DSIT announced the launch of the AI and Digital Hub. This is a pilot advisory service to be run by the Digital Regulation Cooperation Forum (the joint body comprising the ICO, Ofcom, the CMA and the FCA). The scheme is intended to launch in 2024 and will allow businesses to seek tailored advice on how to meet regulatory requirements for digital technology and AI, helping new products and innovations reach the market quickly and safely. There will be an initial one-year trial period for the scheme.
  • DSIT also published a number of case studies on AI assurance techniques to show how they can be applied to assist organisations in demonstrating compliance with the principles set out in the AI White Paper.
  • The CMA published a report summarising its initial review of competition and consumer protection issues in the development and use of AI foundation models.
  • Lord Holmes of Richmond introduced the Artificial Intelligence (Regulation) Bill to the House of Lords in November. As a Private Member's Bill it has little chance of being adopted but, among other things, it proposes creating a central AI Authority to oversee regulation of AI and carry out various monitoring and oversight functions.
  • In November, the Prime Minister announced the world's first AI Safety Institute to advance knowledge of AI safety, evaluate and test new AI and explore a range of risks. In his speech, the Prime Minister also reiterated the UK's approach to regulating AI set out in its AI White Paper. DSIT published a discussion paper to support the summit and a report evaluating the six-month pilot of the UK's AI Standards Hub. In addition, leading frontier AI firms responded to the government's request to outline their safety policies.
  • In December, the UK government published draft AI Skills for Business guidance for consultation.
  • The UK's National Cyber Security Centre (NCSC) announced new voluntary global guidelines on secure AI system development in December. The guidelines were developed in association with the US and industry and have been endorsed by national agencies from 16 other countries, including G7 members.
  • On 8 December, the CMA published an invitation to comment on the partnership between Microsoft and OpenAI. The CMA is considering whether or not a relevant merger situation exists. 

EU

The EU has taken a different approach to the UK when it comes to regulating AI, intending to introduce top-down legislation to place risk-based requirements on development and use of AI and progressing its AI Liability Directive. 

Provisional political agreement was reached on the AI Act (AIA) on 8 December 2023 after lengthy and intense negotiations between the European Parliament, the Council of the EU and the European Commission.

The legislation continues to take a tiered approach, with a ban on certain types of 'unacceptable risk' AI and the most onerous provisions relating to high-risk systems. Press releases (EP, Council) and the press conference highlight the following; however, the compromise text has not yet been completed and, without seeing the final drafting, we do not have full details:

Definition of AI

This follows the OECD definition to ensure international consistency.

Scope and exemptions

The AIA will not apply to areas outside the scope of EU law and will not apply to systems used exclusively for military or defence purposes, to AI systems used solely for research and innovation, nor to people using AI for non-professional purposes.  Protections have been included to prevent misuse for national security purposes.  Only completed systems and models will be covered - systems under development will not be in scope. It is unclear whether existing systems will be impacted.  Open source and free models are excluded from the majority of the Act but may still be subject to bans (where relevant), and are subject to transparency obligations around training data, as well as a requirement to comply with copyright law.

Banned AI

The following will be banned:

  • Biometric categorisation systems to infer sensitive data or which use sensitive data
  • Untargeted scraping of the internet and CCTV systems to create facial recognition databases
  • Emotional recognition in the workplace and educational institutions
  • Social scoring based on social behaviour or personal characteristics
  • AI systems which manipulate human behaviour to circumvent free will or which exploit people's vulnerabilities
  • Predictive policing for individuals (in some cases).

Law enforcement exemptions

Safeguards have been included to protect fundamental rights. The use of remote biometric identification (RBI) systems in public spaces for law enforcement purposes will be permitted subject to prior judicial authorisation and only for a strictly limited list of defined use cases. Post-remote RBI will be permitted in the targeted search of a person convicted or suspected of having committed a serious crime. Real-time RBI use will be subject to strict limitations including time and location, for the purposes of targeted searches of victims of certain crimes, prevention of a specific and present terrorist threat, or to locate or identify a person suspected of having committed stipulated serious crimes (including terrorism, trafficking, murder and environmental crime). An emergency procedure has been included to allow law enforcement agencies to deploy a high-risk AI tool which has not passed a conformity assessment in cases of urgency.

High-risk systems

Systems which pose a risk of significant harm to health, safety, fundamental rights, the environment, democracy (including systems used to influence the outcome of elections), and the rule of law, will be classified as high-risk.  A mandatory fundamental rights assessment will be required before high-risk systems are put on the market.  Deployers also have increased transparency requirements and public bodies which use them will be required to register their use in an EU database.  Individuals will have the right to make complaints about AI systems and receive explanations about decisions made by high-risk systems which impact their rights and must be informed when they are exposed to an emotion recognition system.

General-purpose AI systems (GPAI) and foundation models

New provisions have been added to cover GPAI (AI systems which can be used for many different purposes) and the foundation models they are based on. Transparency requirements will apply. These include drawing up technical documentation, complying with EU copyright law, and publishing detailed summaries of the content used to train the models. High-impact GPAI models (foundation models trained with large amounts of data and with advanced complexity, capabilities and performance, which can pose systemic risks) will be subject to additional requirements including risk evaluation and mitigation obligations. They will also be subject to adversarial testing requirements, cyber security rules and incident reporting requirements, and will be required to report on their energy efficiency. Until such time as harmonised standards are published, codes of practice can be relied upon.

Limited-risk systems

These will be subject to light transparency obligations, for example, a requirement to disclose that content is AI-generated.

Governance and oversight

An AI Office will be created within the Commission to oversee the most advanced AI models, contribute to fostering standards and testing practices, and enforce EU-wide rules.  The Office will be advised by a scientific panel of independent experts, in particular about GPAI systems, foundation models, and material safety risks.  The AI Board, comprised of Member State representatives, will provide a coordination function and act as an advisory body to the Commission.  Member States will be given a role in managing implementation, including the design of codes of practice for foundation models.  An advisory forum of stakeholders including industry representatives, SMEs, civil society and academia, will be set up to provide technical expertise to the Board.

Penalties

Non-compliance can result in fines of up to the higher of:

  • €35m or 7% of annual global turnover for violations relating to banned AI applications
  • €15m or 3% for violations of AI Act obligations (more proportionate caps are set for SMEs and start-ups)
  • €7.5m or 1% for supplying incorrect information
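
By way of hypothetical illustration of how the 'higher of' cap works (the €1bn turnover figure is our own example, not from the agreement), a provider with an annual global turnover of €1bn committing a violation relating to banned AI applications would face a maximum fine of the higher of €35m and 7% of that turnover (€70m), ie €70m.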

Complaints may be made by individuals and entities to the relevant market surveillance authority concerning non-compliance.

Measures to support innovation and SMEs

Clarifications have been added around allocation of responsibilities in the supply chain, in particular between providers and users of AI systems, and on the interaction with other existing laws. Measures to support innovation and SMEs have also been included, including provisions on AI regulatory sandboxes and real-world testing.

Next steps and timeline

The AI Office will be set up immediately. Work will now continue to finalise details and a compromise text will be submitted to Member State representatives (Coreper) for endorsement. The text will need to be confirmed by the European Parliament and Council and will then be subject to legal-linguistic revision before progressing to the final stages of adoption. The legislation is expected to come into force in Spring 2024. There will be a two-year implementation period, although the banning provisions are expected to apply within six months and the transparency and governance rules after twelve months.

AI safety summit

The first international AI Safety Summit hosted by the UK at Bletchley Park took place on 1-2 November 2023.  Key developments arising from the summit include:

  • the signing of the 'Bletchley Park Declaration' in which representatives of 28 governments including the UK, US and China, plus the EU, committed to working together on shared AI safety standards, particularly in relation to frontier AI (advanced AI systems)
  • a non-binding agreement between 11 of the attendee countries including the EU, US, UK, Japan and Australia (but not China) and eight leading AI companies including OpenAI (the developer of ChatGPT), Microsoft, Meta, Google and Amazon, to allow regulators to review their products before placing them on the market and to collaborate on pre- and post-launch safety testing
  • support for an international expert body on AI
  • the announcement of a report on the state of AI science, to be written by a group of leading academics led by Yoshua Bengio, and supported by an advisory panel comprising representatives of attendee countries
  • a commitment to further summits, with the next one to be hosted in France.

Read more here.

Some other notable developments

  • As part of the Atlantic Declaration: a framework for a twenty-first century US-UK Economic Partnership announced on 8 June 2023, the US and UK agreed to accelerate cooperation regarding "safe and responsible development" of new technologies including AI
  • China launched its Interim Measures for the Administration of Generative Artificial Intelligence Services on 10 July 2023. Read more here.
  • In July 2023, the White House announced that representatives from Amazon, Anthropic, Google, Inflection, Meta, Microsoft and OpenAI had agreed to a number of principles for developing AI based on safety, security and public trust. Microsoft and Google subsequently announced they would offer to indemnify commercial users of their generative AI products for specified types of liability
  • The G7 leaders agreed International Guiding Principles for all actors in the AI ecosystem and an International Code of Conduct for developers of advanced AI systems as part of the Hiroshima AI process 
  • The UN announced the launch of a high-level advisory body on AI. This is a multi-stakeholder body intended to undertake analysis and make recommendations for international governance of AI
  • President Biden issued an Executive Order on safe, secure and trustworthy AI (EO) in November. Among other things, the EO requires developers of the most powerful AI systems to share their safety test results and other critical information with the US government. Vice President Harris subsequently announced a range of commitments and policy developments at the AI Safety Summit, including the establishment of an AI Safety Institute intended to operationalise NIST's AI risk management framework, creating guidelines, tools, benchmarks and best practice recommendations to identify and mitigate AI risk.  It will also enable information sharing and research, including with the UK's planned AI Safety Institute.  The VP also announced draft policy guidance on US government use of AI, and the US made a political declaration on the responsible military use of AI and autonomy
  • The Partnership on AI consulted on its guidelines on safe foundation model deployment
  • The Global Privacy Assembly resolution (sponsored by the EDPS) on generative AI systems commits to ensuring the application and enforcement of data protection and privacy legislation in the context of generative AI, working cooperatively, and encouraging stakeholders to take privacy and data protection into account when developing and using generative AI systems
  • Germany announced an AI action plan to boost investment in Germany but also focused on collaboration with its EU partners.

Digital Health and AI

The World Health Organisation  published guidance on Regulatory considerations on artificial intelligence for health.  This aims to outline key principles which governments and regulatory authorities can follow to develop new guidance or adapt existing guidance on AI.

In October, the UK's MHRA together with the FDA and Health Canada, published guiding principles for using predetermined change control plans for machine learning-enabled medical devices.  The aim of the principles is to support manufacturers of these devices in reducing the regulatory burden of reassessment following changes and updates to their devices by using predetermined change control plans underpinned by the principles.  

The MHRA also announced plans to launch an AI regulatory sandbox for developers in 2024, to be known as the AI-Airlock.  This is intended to facilitate the development of medical software and medical devices that use AI and to expedite the availability of this technology for NHS patients.

Read our predictions for digital health in 2024 here.

Technology

Automated vehicles

The government and Ofgem published the Electric Vehicle Smart Charging Action Plan in January 2023. This sets out the steps the government and Ofgem will take to deliver energy flexibility from EV charging, providing affordable, green power. The government announced various grants to businesses to help them assist with these goals.

In February, the Law Commission published advice to the government on issues arising from remote driving (wireless connectivity which allows a person outside a vehicle to control it on a public road). The Commission recommended short-term changes to the law, including prohibiting remote driving under the Road Vehicles (Construction and Use) Regulations 1986 and allowing conditions or restrictions to be placed on specific types of vehicle. It concluded that remote driving raises issues including connectivity failings leading to safety issues, and cyber security vulnerabilities.

As announced in the King's Speech, the UK government introduced the Automated Vehicles Bill to Parliament on 8 November 2023.  The Bill sets out a legal framework for automated vehicles (AVs) including by:

  • Setting enforceable safety thresholds for self-driving vehicles
  • Making companies liable for the way their self-driving vehicles behave on the road and protecting users from being unfairly held accountable
  • Introducing an authorisation system which will also identify the organisation responsible for the vehicle using the automated technology – likely to be the manufacturer
  • Requiring authorised self-driving systems to be overseen by a licensed operator when the system operates without a responsible person inside the vehicle
  • Requiring a permit for AV passenger services or a licence under existing PHV laws
  • Prohibiting misleading market practices including the use of ambiguous terminology relating to whether vehicles are classified as self-driving
  • Allowing for secondary legislation to set out specific terminology and symbols to be used in marketing self-driving vehicles
  • Requiring traffic regulation orders to be made available digitally in a common format for use in self-driving vehicles and other systems that facilitate driving vehicles on a road.  This data can then be used to create a digital map to help support the safe operation of AVs
  • Creating new criminal offences in relation to non-compliance.

Internet of things

See section on Data and cyber security for information about the Product Security and Telecommunications Infrastructure Act and related secondary legislation as well as the EU's Cyber Resilience Act. You can also see a detailed update on IoT here.

Cryptoassets and other digital assets

In June, the Law Commission published its final report setting out recommendations on changes to the law to recognise and better protect digital assets. In line with its interim report, the Law Commission concluded that the common law system in England and Wales is well-placed to accommodate existing and new types of digital asset, but recommended two types of statutory reform:

  • Legislation to confirm recognition of 'digital objects' as a third category of property – this will cover cryptoassets and crypto-tokens among other types of digital asset
  • Legislation to put in place a bespoke statutory framework that better and more clearly facilitates the entering into, operation and enforcement of certain crypto-token and cryptoasset collateral arrangements (eg securitisation).

The Commission further recommended that the government set up a panel of experts to provide guidance on technical and legal issues relating to digital assets.

In October, the House of Commons Culture, Media and Sport Committee published a report on NFTs and the Blockchain: the risks to sport and culture. The report sets out the conclusions and recommendations from the Committee's inquiry into the subject.

In November, HM Treasury published a response (discussed here) to its February 2023 consultation (discussed here) on the future regulatory regime for cryptoassets. This confirms the intention to bring the financial services regulation of cryptoassets within the scope of the Financial Services and Markets Act 2000 (FSMA). The list of specified investments under the FSMA (Regulated Activities) Order 2001 will be expanded, and firms undertaking relevant activities involving cryptoassets as part of their business activities will need to be authorised by the FCA. It also sets out policy on issues raised in the call for evidence relating to decentralised finance (DeFi) and on the regulation of other cryptoasset activities including investment advice and staking, and sustainability. Secondary legislation is anticipated in 2024. The UK Jurisdiction Taskforce also consulted on digital assets and English insolvency law, as discussed here.

Digital IDs

In February, the EC published its European Digital Identity Wallet Architecture and Reference Framework. It provides a set of specifications needed to develop an interoperable European Digital Identity (EUDI) Wallet Solution, based on common standards and practices.

In March, the OECD published draft recommendations on digital ID implementation and governance. They outline principles based on three pillars:

  • Developing user-centred and inclusive digital identity systems
  • Strengthening the governance of digital identity
  • Enabling cross-border use of digital identity.

Some other interesting developments:

  • From 20 January 2023, Ofcom has been issuing licences to businesses operating drones for commercial services including where the drone travels beyond the sight of the operator. The licence covers drones operated in the UK and territorial waters, lasts indefinitely (subject to payment of a £75 annual fee), authorises a specific range of audio equipment, and permits the use of satellite and mobile technologies. Operators also need to comply with air safety regulations.  
  • In February, the European Commission launched a blockchain regulatory sandbox to run from 2023-26. It will support 20 projects annually, providing controlled environments for product and service testing with input from relevant regulators
  • In March, the EC published a call for evidence on metaverses and how to achieve interoperable and innovative virtual worlds that can be used safely and with confidence by the public and businesses 
  • The UK government published its National Quantum Strategy in March. This sets out a ten-year plan of objectives and priority actions, and looks at the UK's strengths and the challenges it faces. In November, the government published its national quantum strategy missions. The House of Commons Science and Technology Committee launched an inquiry into the state of the UK's quantum industry and its implications
  • The European Commission adopted a Communication in July, setting out a strategy and proposing actions on virtual worlds and Web 4.0 
  • In March 2023, the UK published a new International Technology Strategy intended as a roadmap for reaching 'tech superpower status' by 2030. It sets out ten priority actions which are largely broad ambitions rather than concrete steps. It also sets out priority principles (open, responsible, secure, resilient), six strategic priorities, and priority technologies (including AI and quantum, all as underpinned by data).  
  • The government made a commencement order on 11 July to bring in provisions under ss 19-21 of the Digital Economy Act which relate to domain name registries. These give various powers to the Secretary of State, including to intervene in the operation of internet domain name registries in certain circumstances where particular failures or issues have been detected. Later that month, DSIT consulted on the design of regulations in relation to UK-related domain name registries, which the newly commenced DEA provisions allow to be made. The consultation asked for views on the abuse of relevant domain names to ensure procedures remain in place to deal with both misuses and unfair uses of domain names.
  • In October, the government updated the voluntary code of practice for app store operators and developers and extended its implementation period. The eight principles set out in the code should now be implemented by June 2024, after which DSIT will assess engagement and make policy recommendations. DSIT also conducted a survey to help it understand current security and privacy practices used by app developers.
  • The CMA announced a market investigation into the provision of cloud services in the UK in October 2023, following Ofcom's final report on its market study. Ofcom's final report confirmed its preliminary findings – that the main suppliers of cloud services in the UK, Amazon Web Services and Microsoft, have a combined market share of 70-80%, that the market is not working well, and that certain features, including egress (exit) fees and interoperability issues, create barriers to switching. The CMA is opening a market investigation with a deadline of 4 April 2025. It will use the period until May 2024 to gather information, and will publish working papers and disclose its thinking to relevant parties between March and June 2024. Provisional decisions will be published for consultation in September or October 2024, with the final report to be published between February 2025 and the final deadline of 4 April 2025.
Data and cyber security

Consumer protection and product liability

The UK's Digital Markets Competition and Consumers Bill was introduced in April 2023 to reform the UK's competition and consumer protection regimes.  The Bill is intended to tackle a wide variety of consumer protection issues but looks, in particular, at subscription traps, and provides for secondary legislation to deal with fake online reviews.  These are both issues which have featured in enforcement action and litigation in the UK and the EU this year.

Digital Markets Competition and Consumers Bill

The government published its Digital Markets, Competition and Consumers (DMCC) Bill in May 2023. The DMCC Bill:

  • Sets out a regulatory framework for tackling competition issues in big tech, including by placing the Digital Markets Unit on a statutory footing with new powers to regulate digital businesses with "strategic market status", and giving the DMU new enforcement powers
  • Reforms the UK's competition regime more widely
  • Provides powers to introduce secondary legislation, particularly with the aim of introducing protections against fake online reviews
  • Introduces measures to protect consumers from subscription traps
  • Reforms protections for consumers from unfair commercial practices (revoking the CPUT Regulations)
  • Enhances the CMA's enforcement powers, including by allowing it to enforce consumer rights directly and impose significant penalties for breaching consumer protection laws.

The DMCC Bill was re-introduced to Parliament on 8 November as announced in the November 2023 King's Speech. It contains amendments, including minor ones to the onerous subscription information provisions, and is now progressing through the House of Lords. You can read more about the DMCC Bill here.

In September, the government published a 'Consultation on Improving Price Transparency and Product Information for Consumers' and a report on 'Estimating the prevalence and impact of online drip pricing'. The consultation set out proposed wording for a ban on fake online reviews which would be added to the list of automatically unfair commercial practices in the DMCC Bill. It also looked at:

  • display of pricing information
  • hidden fees and drip pricing
  • how professional diligence requirements should be interpreted for online platforms and whether the term should be redefined with the aim of ensuring online platforms and consumers have greater clarity over their respective rights and responsibilities
  • whether enforcers other than the CMA should be able to apply to court for an online interface order.  

In addition, there are questions on the suitability of the current list of 31 automatically unfair commercial practices which are intended to be imported into the DMCC from the outgoing CPUT Regulations. See here for more.

Fake reviews and subscription traps

The ASA published high-level guidance in March on Avoiding fake reviews – a guide to testimonials and endorsements. It consists of seven top tips to help ensure quotes from clients used to endorse products comply with the CAP Code. Claims have to be accurate, capable of substantiation and unlikely to mislead.

At the end of March, the ASA published an enforcement notice to help tackle the issue of subscription traps in digital advertising. The ASA began targeted enforcement action against non-compliant advertisers from 27 April 2023. 

The ECJ ruled in a reference from Austria that the 14-day withdrawal right under the Consumer Rights Directive applies only from the date of entering into a free trial, and not when the free trial converts to a paid subscription or when it auto-renews. This is provided that the consumer has received clear pre-contractual information about price and when payments become due. The contractual terms do not change so there is no new right to withdraw when the contract moves to the paid-for period. If the consumer has not been properly informed, there will be a new right of withdrawal beginning at the end of the free trial period. This is obviously an EU decision but it concerns the Consumer Rights Directive, implemented in the UK as the Consumer Contracts Regulations. In the UK, the DMCC Bill will introduce enhanced withdrawal rights for consumers for subscription contracts.

Consumer terms and conditions

In January, Google made a series of commitments to change some of its commercial and contractual practices to align with EU consumer protection rules. Impacted services are Google Store, Google Play Store, Google Flights and Google Hotels. The majority of these relate to the provision of pre-contractual information and clearer information and transparency around consumer protection rights (eg of withdrawal). They also focus on improved compliance with geo-blocking rules, and a commitment to create a specific email address for consumer protection authorities to use to request quick removal of illegal content. 

On 6 March 2023, the European Commission announced that WhatsApp had committed to providing greater transparency on future changes to its Terms of Service (ToS) including:  

  • explaining what changes it intends to make and how they could affect users' rights
  • ensuring there is an option to reject updates that is as prominent as the option to accept the updates
  • ensuring that notifications about updates can be dismissed or the review of updates delayed
  • respecting users' choices and not sending recurring notifications about ToS updates.

The announcement follows a complaint by the Bureau Européen des Unions de Consommateurs, alleging that the 2021 update to WhatsApp's ToS was not clear or transparent, and WhatsApp unduly pressurised its users to accept them, in breach of the Unfair Commercial Practices Directive (2005/29/EC). The Consumer Protection Cooperation Network, which includes consumer protection authorities within the EU, will monitor compliance with these commitments and take enforcement action if necessary. 

The Bureau Européen des Unions de Consommateurs (BEUC) filed a complaint with the network of consumer protection authorities (CPC) in November, alleging that Meta's new model being rolled out in the EU for Facebook and Instagram involves unfair commercial practices. In response to issues around the lawful basis for using tracking technologies, Meta is rolling out an option to choose an ad-free, paid-for version of its Facebook and Instagram platforms as an alternative to consenting to receiving targeted advertising. BEUC claims that aspects of the new model involve unfair commercial practices, including allegations that:

  • Meta is pushing consumers into making a choice they may not want to make by partially blocking its services until users have selected an option, creating a sense of urgency and pressure, which is an aggressive practice
  • Consumers cannot make an informed choice due to misleading and incomplete information, including the suggestion that the non-subscription model is free when consumers actually pay via their data. It is also unclear that personal data will still be processed under the subscription model even though it won't be used for targeted advertising
  • Meta inaccurately presents this new model as a choice. The high subscription fee is a deterrent to choosing that model and, given the market power of Facebook and Instagram, users are reluctant to leave the services. This means they are likely to be pushed into accepting the free model with targeted advertising and do not have a real choice.

EC adopts package to review current ADR framework

The EC adopted a package of measures to review the EU's current alternative dispute resolution (ADR) framework in October 2023.  This includes proposals for: 

  • A Directive to amend the Directive on ADR for consumer disputes and three other Directives in order to simplify current rules and make them more suited to modern consumer marketplaces, in particular, to digital ones.
  • A Regulation repealing the ODR Regulation (and related legislation) in order to discontinue the European ODR platform which is not being used enough to justify its cost.
  • A Recommendation on quality requirements for dispute resolution procedures offered by online marketplaces and EU trade associations.
  • A report on application of the ADR Directive and the ODR Regulation.

Product liability

Both the EU and the UK have made considerable strides in their attempts to reform their product liability regimes over the course of 2023.

On 22 March 2023, the European Commission adopted a proposal for a Directive on common rules promoting the repair of goods (Repair Directive). The proposal lays down common rules for the promotion of the repair of goods in the event of a defect, with the aim of avoiding their replacement and premature disposal. Among other things, the initiative would support the objectives of the European Green Deal by reducing waste. Read more here.

The EU General Product Safety Regulation came into force on 12 June 2023 and will apply from 13 December 2024. The GPSR revokes and replaces the General Product Safety Directive. It sets out revised rules to ensure the safety of products placed on the EU market, both online and offline, unless specific safety rules apply. Products placed on the market before the GPSR applies which conform with the outgoing Directive will not be affected. See here for more. Trilogues on the EU's draft Product Liability Directive, which was published in September 2022, are expected to finish in mid-December.

The EU's Regulation on machinery products was published in the Official Journal at the end of June 2023.  It repeals the Machinery Directive 2006 and creates new rules more suitable for emerging technologies including AI. The Regulation will apply in total from 14 January 2027, with some provisions coming in sooner.  The aim of the legislation is to increase user safety and improve predictability for the machinery industry, especially SMEs. 

The UK government has also been looking to reform its product safety regime and began consulting on proposals in April. The main focus is on reducing the burden on business while maintaining high standards of consumer protection. Proposals look more at types of risk rather than types of product and place an emphasis on tailored guidance, industry-led standards and risk assessment. In August, the UK government also decided to recognise the CE marking indefinitely.

The Office of Product Safety and Standards' consultation (Smarter Regulation: UK Product Safety Review) which was launched on 2 August 2023, also forms part of the government's Smarter Regulation strategy. The consultation outlines thirteen proposals in relation to three core aspects of the proposed new product safety framework: bringing products to the market; online supply chains; and compliance and enforcement. Read more here.

Meanwhile, the EU's draft AI Liability Directive continues to progress, and see also the section on Technology for an update on the UK's Automated Vehicles Bill.

Games and eGaming

In a busy year for the games and gambling sectors, notable developments included the acquisition of Activision Blizzard by Microsoft, and the CMA's attempt to carry out a market investigation into mobile browsers and cloud gaming focusing on Apple and Google. The government's long-awaited White Paper on reforming the Gambling Act 2005 was published in April this year and the Gambling Commission continued to enforce against businesses breaching AML and advertising regulations.  Protection of online users, particularly children and other vulnerable individuals, has been an issue across games and gambling (including where personal data is concerned) and this will only continue to grow in significance next year with the introduction of the UK's Online Safety Act.

Here we look at some of the key regulatory developments in 2023.  For a full analysis of the year's events in the games industry as well as predictions for the year ahead, see our article here. Some highlights from this year's news include:

Games

  • At the end of 2022, the CMA published its issues statement, following its decision on 22 November 2022 to make an "ordinary" market investigation reference in respect of the supply of mobile browsers and mobile browser engines in the UK, and the distribution of cloud gaming services through app stores on mobile devices in the UK. This followed a market study report on 10 June 2022 into the supply of mobile ecosystems in the UK, which found that Apple and Google have an effective duopoly on mobile ecosystems. Apple appealed to the CAT to review the decision, claiming it was ultra vires and did not comply with procedural requirements. It asked the CAT to quash the CMA's decision and stay the market investigation pending the outcome of the appeal. The CAT upheld Apple's appeal but this decision was itself overturned by the Court of Appeal in December. The CMA's action remains paused pending a decision on any appeal to the Supreme Court by Apple.
  • Epic Games, the maker of Fortnite, agreed with the US Federal Trade Commission to pay $520m to settle claims relating to alleged privacy violations and unwanted charges. Epic will pay $275m for violating COPPA (the Children's Online Privacy Protection Act). This is the largest ever penalty for violating an FTC rule. A further $245m will be paid to refund customers for the use of dark patterns and deceptive interfaces, including for billing.
  • In September 2023, the General Court of the European Union dismissed an appeal by Valve and five PC video game publishers against the decision that they had breached Article 101 TFEU by engaging in anti-competitive geo-blocking practices. The Court found the Commission had established to the requisite legal standard the existence of an agreement or concerted practice between Valve and the five games publishers to restrict parallel imports through geo-blocking of keys enabling activation and, in certain cases, the use of the games at issue on the Steam platform. It found the geo-blocking did not pursue the objective of protecting copyright but was used to eliminate parallel imports of those video games and protect the margins earned by Valve and the high royalties collected by the publishers.
  • Following a lengthy process of negotiation with the CMA, Microsoft was able to complete its acquisition of Activision Blizzard in October 2023.  CMA approval was the final piece of the puzzle following approval of the deal by the EC, and a Microsoft victory against the US Federal Trade Commission.  The CMA secured a number of concessions from Microsoft to address its concerns, however, the CMA was highly critical of Microsoft's approach, saying it had dragged out proceedings, wasting time and money because it declined to restructure the deal during the CMA's original investigation. The FTC is continuing to challenge the deal.

eGaming

Gambling Act White Paper

The government published its long-promised White Paper on its plans to reform the Gambling Act 2005 in April 2023. Proposals include:

  • a review of online game design rules to look at limiting speed of play and other characteristics which exacerbate risk
  • a mandatory levy on betting firms to pay for treatment of addiction
  • new player protection checks and stake limits for online slots (between £2 and £15 per spin), and a consultation on measures to give greater protections to vulnerable 18-24 year olds
  • frictionless player protection checks
  • additional powers for the Gambling Commission to enable it to tackle black market operators through court orders and work with ISPs to take down and block illegal online gambling sites
  • rules to prevent bonus offers harming vulnerable people
  • closing loopholes to ensure under-18s cannot gamble either online or via cash fruit machines, and bringing football pools betting in line with National Lottery play for over-18s only
  • a new industry ombudsperson to deal with disputes and rules on redress where a customer suffers loss due to an operator failing in player protection duties
  • a review of the horserace betting levy.

Read more here.

In August, the government published its first set of consultations to implement proposals in its Gambling Act Review White Paper.  The package included consultations on:

  • financial risk and vulnerability checks
  • proposals to reduce the speed and intensity of online products
  • improving consumer choice on direct marketing by allowing an opt-in to marketing on specific product types and specification of desired marketing channels
  • strengthening on-premises age verification.

Further consultations cover online slots stake limits and updated rules for casinos, bingo halls and arcades. The consultations were open for twelve weeks with the findings yet to be published.

DCMS activity

A DCMS Committee was set up to examine the government's approach to the regulation of gambling.

DCMS in collaboration with academia, research councils and the games industry, published a best-practice Video Games Research Framework to support researchers in building a stronger evidence base for future policy making. The Framework is intended to encourage more analysis on video games and related technology, identify priority topics for research, including the economic benefits of games, their role in education and potential impacts on player wellbeing. It sets out priority areas including why people interact with games, their impact on physical and mental health, and the effect of in-game features like spending and advertising on player experience. It also provides advice and information on data sharing, including when the data is personal.

ICO support for data sharing between gambling operators following completion of regulatory Sandbox

The UK's ICO published a report in June 2023, following the exit of the Betting and Gaming Council from the ICO's regulatory Sandbox.  The Council entered the Sandbox to explore the gambling industry's development and trial of a Single Customer View (SCV) solution, developed with operators and intended to enable a more unified and proactive intervention by gambling operators to reduce incidents of gambling related harm. The data sharing project (known as GamProtect) will now be implemented across the gambling industry with support from the Betting and Gaming Council. 

The ICO also wrote to UK Finance sharing its findings and responding to a request for clarification, in relation to the sharing of consumer credit risk data by credit reference agencies with gambling operators, on:

  • whether the processing would be deemed compatible with the original processing purposes
  • the nature of lenders' transparency obligations and whether they have an obligation to notify their consumers of the new sharing by the credit reference agencies ahead of sharing the data.

The ICO also gave advice to gambling operators about sharing credit reference information.

Gambling Commission fines

The year kicked off with the Gambling Commission agreeing a £337,631 settlement with Vivaro Limited (trading as vbet) in relation to anti-money laundering and safer gambling failings. Separately, TonyBet was fined £442,750 for failing to have fair and transparent terms, and for failing to follow social responsibility and AML rules.

In Touch Games, which operates 11 websites, was fined £6.1m by the Gambling Commission after an investigation found social responsibility and money laundering failings. This was the third time it had been fined by the Commission. In 2019, it paid a £2.2m settlement for regulatory failures. This was followed by a £3.4m fine and warning in 2021. At the start of September, its licences to operate in the UK were suspended.

This is a small representation of actions taken by the Gambling Commission during 2023.

High Court rules that Camelot's website terms covered technical hitch

The High Court ruled in April that an interim and optional animated screen saying a player of a Camelot 'instant win' game had won the top prize of £1m was not binding. Camelot claimed a coding issue had generated the error and the final screen display showed only a £10 win. Camelot argued that under its terms and conditions the contents of the animated screen were not relevant to the question of whether or not a player had won a prize.  The High Court agreed, finding the terms and conditions were properly incorporated into Camelot's online contract, were clear and not unusual or onerous, and made it clear that only the final screen was relevant.

See also the Advertising and Data and cyber security parts of this update for more.

Online safety

The UK's Online Safety Act

The UK's Online Safety Act (OSA) received Royal Assent on 26 October 2023.  Some of Ofcom's powers have come in immediately but most of the rest of the provisions will be brought into force under secondary legislation. Much of the detail around compliance under the OSA will be provided by codes of practice and guidance, and Ofcom has now published its consultation on protecting people from illegal harms online.

The OSA focuses on user-generated content (subject to limited exceptions) and applies to user-to-user services and search services as well as pornographic content services. The OSA regulates illegal content and certain specified types of harmful content, focusing especially on content harmful to children on services likely to be accessed by them. Terrorism and Child Sexual Exploitation and Abuse (CSEA) content are a particular focus, but a range of harmful content is also covered in specified circumstances. In relation to the most harmful type of content likely to be accessed by children, age verification/estimation must be used (subject to a limited exception).

The OSA applies to services which have links to the UK. Various safety duties apply to different categories of content. In order to establish what services need to do, they have to carry out a variety of risk assessments against Ofcom risk profiles. Service providers also have transparency requirements and obligations to provide redress and are likely to have to amend their terms and conditions. There are wider duties to protect freedom of expression and the right to privacy including personal data. 

Category 1 services (to be determined, but likely to be the larger social media services) and Category 2A and 2B services have additional duties. In particular, Category 1 services have expanded duties to protect fundamental freedoms including content of democratic importance and news publisher content. They also have to comply with adult user empowerment provisions which require them to give adult users options to prevent them encountering certain types of harmful content.

Ofcom is the regulator of the OSA.  It has extensive powers and duties. It is responsible for producing initial risk profiles and a raft of codes of practice and guidance which will inform how service providers are supposed to comply with the OSA, as well as a range of reports on the impact and operation of the OSA.  The process of introducing these (as set out in Ofcom's revised approach to implementing the OSA) is likely to take at least three years with everything subject to consultation and much of it dependent on the introduction of secondary legislation. 

In addition to publishing its consultation on illegal harms (discussed here), Ofcom also published a call for inputs in November on what good media literacy by design looks like for social media, search, video sharing and gaming services, and a consultation on guidance for service providers publishing pornographic content. The government has also published a consultation on super-complaints under the OSA. The next Ofcom consultations will focus on child safety, protecting women and girls, and pornography content.

Ultimately, Ofcom will have a range of enforcement powers including the ability to fine organisations the higher of up to £18m or 10% of global annual turnover.  The OSA is very wide-ranging and Ofcom estimates that around 100,000 online services could be in scope.

Read more about the OSA here.

Voluntary Online Fraud Charter between UK government and leading tech businesses

The UK government published an Online Fraud Charter in December. This is a voluntary agreement between the government and tech platforms and services to try to reduce fraud. Signatories, which include Amazon, TikTok, Microsoft, X, Google, YouTube, Facebook, Instagram and Snapchat, agree to adopt a range of measures within six months including blocking, reporting and takedown, intelligence sharing, protection against fraudulent ads, transparency, and cooperation with law enforcement.

The EU's Digital Services Act

The majority of the Digital Services Act (DSA) has applied to very large online platforms (VLOPs) and very large online search engines (VLOSEs) since August 2023, and will apply to other companies from 17 February 2024.

The European Commission published non-binding Q&A guidance at the end of January to help VLOPs and VLOSEs publish their average monthly active service recipients on their sites, as required for the purposes of their designation under the DSA.

The European Commission adopted an implementing Regulation in June on detailed arrangements for the conduct of certain proceedings by the Commission – namely investigation and enforcement activities relating to VLOPs and VLOSEs – under Article 83 of the DSA.   

A DSA Transparency database was launched by the Commission at the end of September. The Commission announced the opening of a European Centre for Algorithmic Transparency (ECAT). The ECAT will assess algorithms used by VLOPs and VLOSEs under the DSA. This will include looking at transparency reports and risk self-assessments submitted by the relevant companies, to assess whether they comply with requirements under the DSA, and carrying out inspections into the systems using the algorithms where required to do so by the Commission.

The EC adopted a Delegated Regulation in October, with rules on independent audits to assess compliance of VLOPs and VLOSEs with the DSA. It sets out the steps designated services must take to verify the capabilities and independence of their auditor, and the main principles auditors should apply when performing DSA audits. The first audits will be published in August 2024.

Amazon and German fashion company Zalando are challenging their designations before the ECJ amidst a lack of clarity under the DSA.  On 27 September 2023, the European General Court delivered an interim decision in the legal dispute between Amazon and the European Commission, agreeing to pause the implementation of DSA obligations related to creating and disclosing an advertising repository and offering users a non-profiling-based option for each recommendation system until the Court decides whether Amazon is a VLOP.

Meanwhile German consumer protection organisations are reportedly already considering bringing class actions with respect to non-compliance with the DSA.  Questions have also been raised about the Commission's readiness – it has to sign a number of cooperation agreements by 24 February 2024, when compliance by smaller companies will begin.  

The first information requests under the DSA were sent out by the Commission in October 2023. The EC sent X (formerly Twitter) a request for information to assess its compliance, particularly with regard to its policies and actions regarding notices on illegal content, complaint handling, risk assessment and risk mitigation measures.  X was required to provide requested information on the activation and functioning of its crisis response protocol by 18 October, and information relating to other questions by 30 October.  The Commission will consider its response which could include opening formal proceedings or imposing fines for incorrect, incomplete or misleading information. The EC has also requested information on a variety of issues from TikTok, Snap and Meta.

In December, the EC launched its Digital Services Terms and Conditions database: a database of 790 sets of terms and conditions from more than 400 services provided by over 290 distinct service providers, including Apple, Meta and Microsoft. The entries include a variety of documents such as Commercial Terms, Developer Terms, Live Policy, Terms of Service and Privacy Policy, offering a comprehensive view of the digital legal landscape. The Commission says this is just the beginning – through crowdsourcing with verified users, the numbers will continue to rise, making this an ever-expanding resource.

See here for more about the DSA.

Advertising

The UK government, the Advertising Standards Authority (ASA), the Committee of Advertising Practice (CAP) and the Competition and Markets Authority (CMA) had clear areas of focus in relation to advertising in the UK during 2023, many of which also attracted attention in the EU. Dark patterns, influencer advertising, competition issues, greenwashing, HFSS foods, gambling and advertising to children were all areas of considerable regulatory activity.

See our Interface article on what to expect in advertising in 2024.

Digital advertising

The alleged dominance of big tech in digital advertising was scrutinised by the CMA this year, with Google and Meta both offering commitments to allay CMA concerns, which tended to focus on the use of data.  Various class actions were also launched, although the Competition Appeal Tribunal has yet to decide whether the claims can proceed as collective proceedings.  In June, the European Commission provisionally found that Google had abused a dominant position in digital advertising, concluding that only mandatory divestment of part of Google's services would provide a sufficient remedy.

In February 2023, the European Commission sought feedback on the Unfair Commercial Practices Directive, the Consumer Rights Directive, and the Unfair Contract Terms Directive, as to whether they ensure a high level of protection in a digital environment.

In July, the UK government published plans under its Online Advertising Programme to tackle illegal ads and influencer scams and to protect children online.  New legislation will be introduced and will apply to platforms, intermediaries and publishers (PIPs) in relation to some types of illegal paid-for advertising.  The plan is to fill gaps where there is no applicable legislation.  The government has also formed a Ministerial-led taskforce to drive non-legislative action, working with industry to address illegal harms and protect children by improving the evidence base on the scale of the issue and the impact of harms, and building on existing voluntary initiatives. The intention is that legislation will address paid-for content and that 'intermediaries' will be broadly defined and will include influencers.

The HFSS regime will remain separate, as will issues covered under the Online Safety Act and the Consumer Contracts Regulations.  Care will be taken to avoid conflict with other legislation.  Advertisers are not included in the plans as they are already held to account under the existing self-regulatory regime and applicable backstops. Draft legislation will be proposed when Parliamentary time allows, but there will also be further consultation on the detail.

The ASA published the final report on its Intermediary and Platform Principles (IPP) pilot in October 2023.  The pilot involved the ASA working with ten of the largest digital advertising companies including TikTok, Google and Meta, to promote awareness of and compliance with online advertising standards.  The businesses signed up to principles to help make advertisers aware of the CAP Code and the ASA, and to remove non-compliant ads when advertisers failed to do so.  The report was mainly positive about the pilot.

The Taskforce published its action plan on 30 November 2023. The plan sets out how the Taskforce will work with the advertising industry, regulators and relevant government departments to understand and improve the evidence around the in-scope harms, and to identify ways to enhance voluntary initiatives or standards to tackle them, as well as develop new standards where gaps are identified.  DCMS will publish a further consultation on the OAP policy package and the ASA will publish case-study research.  DCMS will carry out research on online users' experience of advertising harms, and StopScams UK will explore how to build up and share evidence on scam adverts.  The Plan refers to joint-regulator and industry initiatives and incoming legislation, including the Online Safety Act and the DMCC Bill, and the role of Ofcom in particular.  It also refers to the Fraud Charter (see section on online safety) and the importance of sharing information.  There are plans to evaluate the Platform Principles Pilot to determine whether or not to formalise it into the ASA regulatory system, as well as proposals to raise existing industry standards and improve diligence, in particular around age assurance, influencer marketing and age-restricted product ads.

The EU's Digital Services Act requires online platforms to take steps to identify adverts, the advertiser and information about the parameters used to determine who the ad is presented to and how to change that. Very Large Online Platforms must also maintain a repository of this information for a period of one year after the ad was last presented.

Online Choice Architecture and dark patterns

In February 2023, the European Commission and 23 Member State consumer protection authorities published the results of a sweep of 399 online retail shops looking at manipulative practices. The study focused on dark patterns, including fake countdown timers, web interfaces designed to steer consumers to particular choices, and hidden information. 148 of the sites contained at least one of these three dark patterns.  The UK's ICO also published guidance on online choice architecture this year (see the section on Data and cyber security).

In March, the CMA published an open letter to online businesses regarding the use of urgency and price reduction claims, urging businesses to review their online selling practices to ensure they are not unfair or misleading. Two days after publishing the open letter, the CMA launched an investigation into Wowcher to determine whether its group of companies has misled consumers by using countdown timers and other urgency claims to put unfair pressure on them to make purchases quickly.  In November, the CMA asked Wowcher to submit undertakings as to remedial measures in order to avoid court action.

In July, the CMA announced it had written to the Emma Sleep group detailing its concerns over harmful online sales practices, including countdown timers and discounts, which the CMA considers misleading. This followed an investigation into the group which the CMA commenced in November 2022.  As a result of its investigation, the CMA found evidence that discount claims made by Emma Sleep did not stack up against the actual savings made by customers. The CMA was also concerned that the firm's use of countdown timers and claims of high demand for certain products could mislead consumers. The investigation is part of the CMA's wider programme to tackle online choice architecture.

The EU's Digital Services Act (DSA) requires online platforms to avoid using dark patterns.  They are prohibited from designing, operating or organising their interfaces "in a way that deceives, manipulates or otherwise materially distorts or impairs the ability of recipients of their service to make free and informed decisions", and they must also publish the parameters used in their recommendation systems.  The UK's Online Safety Act, on the other hand, does not tackle this issue but contains safety duties for Category 1 services (to be determined) relating to fraudulent paid-for advertising.

Influencers

The CAP published updated influencer guidance in March 2023, which aims to make clear that "ads are ads".  In April, the ASA published its collated resources for influencer advertising, and the FCA published guidance on 'finfluencing' in July. 

In the EU, also in July, the European Economic and Social Committee published a report, prepared at the behest of the Spanish Presidency of the EU, recommending greater harmonisation of influencer marketing regulation across the EU.

Greenwashing

In July, CAP and BCAP updated their greenwashing guidance to take account of recent ASA decisions and CMA guidance.  The update was timely, given that greenwashing has continued to attract CMA and ASA attention.

In January 2023, concerned about the way products are being marketed to customers as "eco-friendly", the CMA opened an investigation into ASOS, Boohoo and Asda, looking into green claims made about their fashion products. The CMA has not yet reached a view on whether consumer protection law has been breached.  On 26 January 2023, the CMA announced that it would be expanding its misleading green claims project to look at fast-moving consumer goods (FMCG).

In November, CAP and BCAP updated their guidance on misleading green claims to cover green disposal claims. The guidance now clarifies the type of information to include to avoid misleading consumers.  The ASA will begin additional monitoring and enforcement work focusing on disposal claims based on established positions from January 2024.  From 1 April 2024, it will begin proactively investigating potentially problematic claims, focusing in particular on claims omitting material information and misleading or unsubstantiated claims.

You can read more about ASA decisions on green advertising here, and about regulation of green advertising across the EU, including its draft Green Claims Directive published in March, here. See here for more about the EU's proposals under its Green Package.

Cryptoassets

In June 2023, the UK's FCA announced new rules to ensure consumers have the appropriate knowledge and experience before investing in cryptoassets, and to ensure adverts are clear, fair and not misleading.

The FCA will take over from the ASA the regulation of non-broadcast media ads for qualifying cryptoassets (those that are transferable and fungible), including cryptocurrencies and utility (fan) tokens.  It will introduce new rules which will apply to all firms marketing qualifying cryptoassets to UK consumers, regardless of where the firms are based.  NFTs are not affected as they are non-fungible.  Cryptoassets have been classified as "Restricted Mass Market Investments", meaning they can be mass marketed to UK consumers subject to certain restrictions, including that they must carry clear risk warnings and risk summaries and must not include incentives to invest.  There is also an overarching requirement that financial promotions must be fair, clear and not misleading.

The ASA will continue to regulate the non-technical aspects of ads for FCA-regulated products under the CAP Code. 

Separately, the Bureau Européen des Unions de Consommateurs (BEUC) filed a complaint with the European Commission and consumer authorities against leading social media platforms, alleging that they allow misleading adverts and 'finfluencer' posts promoting cryptoassets on their platforms.

Gambling

In October 2022, the CAP and BCAP Codes were amended to prohibit adverts for gambling and lotteries that are likely to be of strong appeal to children or young persons, in particular by reflecting or being associated with youth culture, and guidance was issued on the new rules.  In October 2023, CAP published learnings from seven key ASA adjudications on the new rules. Key factors include the size of a social media following and whether or not a player or personality appearing in an ad is still active in their field.

Amendments to the lotteries rules were made in July to make clear that the ban on people under 25 years of age appearing in a significant role in lottery ads includes those who appear to be under 25 (as well as those who are actually under 25). There are exceptions where the ad depicts a beneficiary of a lottery.  

Children

In July 2023 (alongside the announcement of the Online Advertising Programme), the ASA published The 100 Children Report: Monitoring age-restricted ads served to children on social media and online.  The ASA found that steps taken by advertisers to prevent children from being served restricted ads included:

  • Age restrictions
  • Negative key words
  • Audience profiling
  • Automated scripts to check appropriate measures are in place and suspend ads in case of any issues.

The ASA also noted steps taken by Meta and Google to prevent children from falsely representing themselves as adults, including preventing them from changing their initially entered year of birth, and using AI models trained on behavioural data.   The ASA is not taking action against advertisers and noted the difficulties associated with age verification.

HFSS, vaping, alcohol alternative products and body image

In January 2023, the government announced a further 21-month delay to the introduction of a 9pm watershed for TV adverts for HFSS products and a ban on paid-for online advertising of them. The new rules are now expected to come into force on 1 October 2025. In the meantime, the government is consulting on the draft secondary legislation, the Advertising (Less Healthy Food Definitions and Exemptions) Regulations 2022, which will implement the ban. The government is not intending to revisit policy but is seeking feedback on the text.

In June, the government delayed the introduction of new rules restricting the promotion of multibuy deals on HFSS products for a further two years. The restrictions, which were to have applied in store and online, were initially intended to come into effect on 1 October 2022 and had already been delayed for a year. The additional delay is attributed to not wanting to deprive the public of special offers for cheaper food and drink during the cost of living crisis.

In July, Ofcom set out its approach to implementing the new restrictions on advertising of less healthy food and drink products on TV, on demand and online.  Ofcom confirmed it is designating the ASA as a co-regulator, alongside Ofcom, of paid-for online space and that it will also amend the BCAP Code to reflect the new restrictions.

Read more about these developments here.

The government published a consultation on proposed youth vaping restrictions (which had already been the focus of ASA enforcement action earlier in the year) in October 2023.  These include banning the sale of tobacco products to anyone born on or after 1 January 2009, restrictions on the sale of disposable vapes, and banning the sale of non-nicotine vapes to under-18s. The consultation closed on 6 December 2023. 

In November, BCAP and CAP announced changes to their respective codes around marketing alcohol alternative products which will come into effect on 14 May 2024, and have published related guidance.  Alcohol alternatives are defined as products with an ABV at or below 0.5% which are marketed as alcohol alternatives.  The main concern is that marketing of these products must avoid the indirect promotion of alcohol.  Any advert or marketing material which fails to do this will be subject to the rules around promoting alcohol including around placement and scheduling.  It must be clear that products are alternatives to alcohol and marketing must not encourage or condone drinking alcohol in circumstances where it would be inappropriate or unsafe, or condone or promote heavy drinking more generally. Adverts for alcohol alternatives must not be targeted at under-18s and must not be likely to appeal to them (for example by association with youth culture or by featuring someone who appears to be under 25).

In November, CAP and BCAP published an update to their review of harms relating to digitally altered images in advertising and harms relating to body image which concluded that there is no need to introduce a requirement to label ads to indicate digitally altered body parts or proportions because doing so would not mitigate potential harms.  The update also set out action points for the coming year.

ASA and CAP guidance

In November, the ASA launched its new five-year strategy – AI-assisted collective ad regulation – which includes targets to make sure ads are responsible and people are protected from being misled, harmed or offended by them.  The ASA said it had made significant progress in proactively regulating ads and is increasingly using AI to help it identify potentially problematic ads and support its compliance work.  Over the next five years it plans to invest more in its preventative and proactive work than in its reactive complaints casework.  It will focus on resolving investigations more quickly and on preventing irresponsible ads from appearing in the first place.  Active ad monitoring using AI support will be scaled up significantly.

In a busy year, some other key activity from the ASA and CAP includes:

Other UK and EU legislative developments in and related to the commercial tech and data sectors include:

UK

REUL Act and related secondary legislation

The Retained EU Law (Revocation and Reform) Act received Royal Assent in July 2023, having been significantly reduced in scope over the course of its Parliamentary passage.  It will revoke around 600 pieces of legislation at the end of 2023, and gives the government extensive powers to revoke, restate, replace and update secondary retained EU law.  It also makes changes to the supremacy of law provisions. Some of these changes will take effect at the end of 2023 but some have applied since the Act came into force.

In September 2023, the government laid the draft Retained EU Law (Revocation and Reform) Act 2023 (Consequential Amendment) Regulations 2023 before Parliament under the draft affirmative procedure.   Intended to come into force on 1 January 2024, the Regulations will amend certain Acts of Parliament in order to implement sections 2, 4 and 5 of the REUL Act.  The amendments are listed in a 26-page Schedule and mostly relate to changes in terminology. The accompanying explanatory memorandum says they do not reflect any change in policy but provide clarity. The government has also published a web page housing a complete set of Statutory Instruments made under the REUL Act.

The draft Retained EU Law (Revocation and Reform) Act 2023 (Revocation and Sunset Disapplication) Regulations 2023 (REUL Regs) were laid before Parliament on 4 September 2023.  The REUL Regs will: 

  • Preserve specified pieces of REUL from revocation at the end of 2023 under the power conferred by section 1(4) of the REUL Act – four pieces of UK legislation and three pieces of Northern Ireland legislation that were originally listed in Schedule 1 to the REUL Act as being revoked are now being preserved
  • Revoke 93 pieces of legislation at the end of 2023 under the s14(1) REUL Act power to revoke REUL without replacing it.  The government says this legislation is either redundant, has been superseded, or no longer has any legal effect in the UK following withdrawal from the EU.

In May 2023, the government had published a list of laws to be revoked in a draft Schedule 1 to what was then the REUL Bill (as discussed here).  Fishing, the environment and food were the three stand-out areas to face change, although the list of environmental laws which would potentially have fallen away under the sunset clause ran to 1,700, of which only 341 were included in the Schedule.  Even in these areas, many (although by no means all) of the laws identified are no longer relevant in post-Brexit UK.

The good news is that we will certainly not be seeing an immediate collapse of the commercial legal framework in the UK.  The REUL Act does, however, remain controversial due to the wide-ranging ministerial powers to unilaterally revoke REUL using secondary legislation.  A further issue is the degree of uncertainty which will remain as to how to interpret the law going forward, leading to a potential increase in litigation as claimants seek to overturn the effect of CJEU decisions.

The Windsor Framework

The post-Brexit position of Northern Ireland was revised under the Windsor Framework which replaced the Northern Ireland Protocol in February 2023. It effectively enables Northern Ireland to continue to have access to the Single Market while remaining in the UK.

Media Bill

In March 2023, the government published a Draft Media Bill for pre-legislative scrutiny to modernise broadcasting legislation following a 2022 White Paper. Changes include:

  • Introducing an Ofcom video-on-demand (VoD) standards code, similar to the Broadcasting Code, for on-demand content made available to the public in the UK. This would bring Netflix, Amazon Prime Video and Disney+, among others, into scope
  • Ensuring public service broadcasters' on-demand services are easy to discover on smart TVs and streaming sticks by requiring on-demand providers to give prominence to public service content
  • New reforms to guarantee access to UK radio on smart speakers.

These are part of an extensive programme of reform to the broadcasting sector as a whole. 

A consultation was published which closed on 17 May 2023 and the Bill was formally announced in the November 2023 King's Speech.  The House of Commons Public Bill Committee subsequently issued a call for evidence which closes on 14 December 2023.  The government also published its response to two House of Commons DCMS committee reports on the Bill.

Read more about the Media Bill here.

Bill of Rights

Dominic Raab's 'flagship' Bill of Rights was quietly withdrawn in July 2023.  The Bill was originally presented during the Johnson government, paused under Liz Truss, and then re-commenced before being permanently shelved.  It is currently unclear what, if anything, will replace it. 

Ofcom revised regulatory enforcement guidelines

Ofcom decided to restructure the Regulatory Enforcement Guidelines to make them easier to follow. These guidelines set out how Ofcom approaches enforcement of regulatory requirements and consumer protection law relating to the industries for which it is responsible.

Industry Working Group on Electronic Execution of Documents final report

In March, the Industry Working Group on Electronic Execution of Documents published its final report on its remaining terms of reference covering issues related to the use of electronic signatures in cross-border transactions, and how best to use electronic signatures to optimise their benefits when set against the risk of fraud. Its February 2022 interim report made a number of recommendations for reform and this report now completes its review. 

DBT call for evidence on smarter regulation and the regulatory landscape

The Department for Business and Trade published a call for evidence on smarter regulation and the regulatory landscape in October 2023.  The aim is to understand what might be improved in the way regulators operate and to identify potential reforms to the UK statute book.  The call closes on 7 January 2024. 

UK government consults on reform of Provision of Services Regulations 2009

The government consulted on reform of the Provision of Services Regulations 2009 (PSRs), as amended on Brexit by the Provision of Services (Amendment etc) (EU Exit) Regulations 2018.  The PSRs originally implemented the EU Services Directive.  They apply to most businesses providing services in the UK and, among other things, require businesses to make certain information available to customers and to handle customer complaints promptly.  The requirement not to discriminate against individual customers on the basis of their nationality or location was removed on Brexit.

UK implementation of DAC7

The UK's Platform Operators (Due Diligence and Reporting Requirements) Regulations 2023, implementing the OECD's Model Reporting Rules for Digital Platforms (the DAC7 rules), were made on 18 July 2023 and will come into force on 1 January 2024, with first reports due in 2025. HMRC added a new chapter to its International Exchange of Information Manual by way of guidance on the reporting provisions for digital platforms introduced by the Regulations.  We consider the implications of the UK regulations and their implementation in further detail here.

Economic Crime and Corporate Transparency Act

The Economic Crime and Corporate Transparency Act received Royal Assent in October 2023, as discussed here. It makes significant changes to the UK's economic crime and fraud regime, including a new 'failure to prevent fraud' offence for large organisations and a widening of corporate criminal liability for economic crimes committed by senior managers.

EU

Digital Markets Act

The majority of the Digital Markets Act (DMA) came into effect on 2 May 2023. Potential gatekeepers meeting the quantitative thresholds had until 3 July 2023 to notify their core platform services to the Commission, which confirmed designations in early September 2023. Designated gatekeepers have until 6 March 2024 to comply with requirements under the DMA. The Commission also published updated guidance on the DMA in the form of Q&As.

In March, the EC adopted a decision to establish a High-Level Group on the Digital Markets Act as required under Article 40 DMA. The Group is made up of 30 representatives nominated from BEREC, the EDPS and the EDPB, the European Competition Network, the Consumer Protection Cooperation Network and the European Regulatory Group of Audiovisual Media Regulators. It will have a mandate of two years and will meet at least annually.   The Group can provide the Commission with advice and expertise to ensure the DMA and other sectoral regulations applicable to gatekeepers are implemented in a coherent and complementary manner. It may also have a role in market investigations and in ensuring the DMA is future proof.

In April, Implementing Regulation 2023/814 was published in the Official Journal, entering into force on 2 May 2023, at the same time as the DMA. The Regulation sets out arrangements for the form, content and other details of the notifications which platforms acting as gatekeepers under the DMA are required to submit.

In August, the Commission published a consultation on a draft template for the descriptions of consumer profiling techniques, and the audits of those descriptions, which gatekeepers under the Digital Markets Act are required to produce under Article 15.

On 6 September 2023, the European Commission published its first set of gatekeeper designations and core platform services under the DMA, as follows (gatekeeper first, followed by its designated core platform services):

  • Alphabet – Google Ads, Chrome, YouTube, Google Android, Maps, Play, Shopping and Search
  • Amazon – Amazon and Amazon Marketplace
  • Apple – App Store, Safari, iOS
  • ByteDance – TikTok
  • Meta – Facebook, Instagram, Meta Marketplace, Meta Ads, WhatsApp and Messenger
  • Microsoft – LinkedIn and Windows PC OS.

The gatekeepers must comply with DMA requirements in full by 6 March 2024, although some obligations (eg to inform the Commission of any intended concentration) apply from designation.

The Commission opened market investigations into submissions by Microsoft and Apple that some of their core platform services do not qualify as gateways despite meeting the threshold for core platform services.  The relevant services are: 

  • Microsoft – Bing, Edge and Microsoft Advertising
  • Apple – iMessage.

It is also looking at whether Apple's iPadOS should be designated as a gatekeeper despite not meeting the thresholds. 

The Commission concluded that while the Samsung Internet Browser meets the thresholds to qualify, it does not in fact serve as an important gateway for reaching end users.  It reached similar conclusions about Alphabet's Gmail and Microsoft Outlook. The full text of these decisions is available on the DMA Gatekeeper webpage.

Apple has brought two appeals: one against the designation of its iOS, App Store and Safari browser as core platform services, and one against the opening of a market investigation into its iMessage service on iPads and iPhones.  Meta is appealing its designations for the Messenger and Marketplace platforms but not for Facebook, Instagram and WhatsApp.  ByteDance is appealing its designation, saying it does not hold an entrenched position in the market, is a challenger rather than an incumbent in the digital advertising market, and does not meet the EEA revenue threshold under the DMA.  Microsoft, Google and Amazon are not challenging their designations.

The EC published a series of FAQs covering a range of practical issues under the DMA for gatekeepers, including how to submit information to the Commission, how the Commission will treat confidential information, and how to respond to information requests from the Commission.

In October, the European Commission published the template for the annual compliance report that gatekeepers will need to submit under the DMA. These reports have to contain all the relevant information the Commission needs to assess compliance with Articles 5-7 of the DMA in relation to all core platform services listed in the designation decision. While the template is an apparently innocuous nine pages, it asks for 'exhaustive explanations' and 'specific information', as well as lists and reports on a huge range of issues. Gatekeepers now have until 7 March 2024 to submit their first reports, after which the Commission will publish a non-confidential summary.  The Commission has also published templates for three further notification requirements.

See here for more about the DMA.

DAC7 – EU implementation

The EU's regulations implementing the OECD's Model Reporting Rules for Digital Platforms, DAC7, came into effect on 1 January 2023, with Member States having been required to incorporate DAC7 into their national legislation by 31 December 2022. The OECD's Model Reporting Rules for Digital Platforms are intended to ensure the tax compliance of participants in the digital economy and to promote a level playing field between online and traditional businesses.  See more here.

A Green Deal Industrial Plan for the Net-Zero Age

The European Commission adopted a Communication on a Green Deal Industrial Plan for the Net-Zero Age in early 2023. This aims to ensure the EU has access to technologies, products and solutions key to transition to net-zero and that represent a major new source of economic growth and quality jobs.

New legislative proposals were adopted at the European Council summit on 23-24 March 2023, including the proposed Green Claims Directive, which is intended to provide EU-wide, strict and uniform minimum criteria for the communication and substantiation of explicit environmental claims. In order to meet the standards, traders will have to get approval of their green claims from an independent verifier. See here for more.

The European Commission also launched a consultation on which categories of new products to address first under its proposal for an Ecodesign for Sustainable Products Regulation (ESPR). The existing Ecodesign Directive focuses only on energy-related products, but the new Regulation will extend beyond energy-related products to a much wider range of goods. Provisional political agreement was reached on the draft Regulation in December. The Commission has identified end-use products (including textiles and cosmetics), intermediary products, and horizontal measures like durability and recyclability as potential priority areas.

The European Parliament adopted its negotiating mandate on the Directive on Empowering Consumers for the Green Transition. This is part of the Green Package and is intended to give consumers the information they need to make environmentally friendly choices, ban greenwashing, and encourage companies to offer more durable and sustainable products. The Directive is now in trilogue.

EC revised Horizontal Block Exemption Regulations and Guidelines

In June, the European Commission adopted revised R&D and Specialisation Block Exemption Regulations and revised Guidelines on the application of Article 101 to horizontal co-operation agreements. The guidelines now cover data sharing, mobile infrastructure sharing agreements and sustainability agreements. The new block exemptions entered into force on 1 July 2023 and will expire on 30 June 2035.

EC payment services proposals

At the end of June 2023, the European Commission published legislative proposals to reform the EU payment services framework: 

  • Proposal for a Directive on payment services and electronic money services amending the Settlement Finality Directive, PSD2 and the second Electronic Money Directive – this proposal is more widely known as PSD3
  • Proposal for a Regulation on payment services in the internal market and amending the EBA Regulation – referred to as the proposed Payment Services Regulation (PSR).

The proposals are intended to enhance the safety and security of electronic payments and transactions in the EU, domestically and cross-border, and to provide a wider choice of payment service providers (PSPs).

European Commission report on the Platform to Business Regulation

The EC published a report following its first review of the Platform to Business Regulation (P2BR).  The P2BR imposes transparency, accessibility and dispute resolution obligations on online intermediation service providers (OISPs) (eg e-commerce marketplaces and social media services) and transparency obligations on online search engines.  Equivalent legislation applies in the UK as retained EU law. The Commission proposes measures to educate OISPs and business users about the impact of the P2BR and to encourage Member State enforcement.  It will explore creating codes of conduct in the hotel bookings and online marketplace sectors.

EC call for evidence in relation to rationalisation of reporting requirements

The EC published a call for evidence on the rationalisation of reporting requirements in October 2023 – ie those stemming from EU legislation obliging Member State authorities and private or public organisations to provide structured or unstructured data to competent authorities at EU or national level.  The Commission is looking to understand the burden placed on organisations, including which, if any, areas are particularly problematic and time-consuming, whether any requirements are obsolete or disproportionate, and whether they can be consolidated and/or digitised in certain areas.  Based on the results, the Commission will prepare rationalisation plans for 2024.  The call closed on 28 November 2023.

EC 2024 Work Programme

The EC published its 2024 Work Programme in the autumn.  This is the final Work Programme of the current Commission and, with European elections coming up in 2024, it contains less new material than usual. Somewhat surprisingly, the ePrivacy Regulation remains on the list of legislative initiatives to be completed.  The Commission will now discuss joint priorities with the European Parliament and Council.  One of the proposals is for a Digital Networks Act to "redefine the DNA of our telecoms regulation".

European Parliament agrees negotiating position on CSAM proposal

The European Parliament adopted its negotiating position on the draft Regulation to prevent the dissemination of child sexual abuse material (CSAM) in November, following provisional agreement. The Regulation would set up a central EU hub to help fight CSAM in the EU.  The draft Regulation also contains provisions for the reporting and takedown of CSAM and has raised concerns that it will be used to circumvent the encryption of messaging apps.
