Both the UK and the EU are focused on protecting online users from illegal and harmful content and have recently introduced wide-ranging legislation in this area. Some gaming businesses will be caught by both regimes.
The UK's Online Safety Act
The UK’s Online Safety Act (OSA) became law on 26 October 2023. It applies to user-to-user services and search services, as well as to services publishing pornographic content. For user-to-user services (the category most likely to be relevant to gaming businesses), its requirements focus on online user-generated content.
The OSA regulates illegal content and certain specified types of harmful content, focusing especially on content harmful to children on services likely to be accessed by them. In relation to the most harmful types of content likely to be accessed by children, age verification/estimation must be used (subject to a limited exception).
The OSA is very wide-ranging: Ofcom (the OSA’s regulator) estimates that around 100,000 online services could be in scope. Offline games are not affected, but Ofcom has highlighted that some games allow users to interact by creating or manipulating avatars, objects and the environment, and/or through chat functionality, and that user-to-user gaming services pose specific risks of harm.
Games with the following functionalities are likely to be in scope:
- Text or voice chat functionality (team chat in team-based games, or chat in large open servers which bring player avatars together). There is an exemption for services that only enable user-generated content in the form of SMS, email or MMS messages and/or one-to-one live aural communications.
- Games built around the generation and sharing of user-generated content.
- Virtual reality/metaverse functionality in communal online spaces.
- Games with built-in livestreaming functionality, forums, marketplaces or other platforms which facilitate user-to-user interactions.
‘Limited functionality’ services are also exempt: those which only allow user-generated content in the form of posting or sharing comments or reviews relating to content published by or on behalf of the service provider, or applying emojis and similar expressions of opinion.
All in-scope services will need to comply with a range of obligations, including risk assessment and mitigation, protecting users from illegal content and child users from certain harmful content, operating complaints processes, reporting and record-keeping. Most businesses will also need to make changes to their terms of service (or EULAs). Larger services that pose risks of several different types of harm are subject to the most onerous obligations.
Ofcom has published its consultation on protecting people from illegal content online, together with an overview and quick guides setting out ‘what you need to know’ to help online services understand the first steps; further consultations will follow.
Understanding whether and to what extent you are in scope, assessing risks to users, and putting in place mitigation measures and processes to comply with the safety duties will be crucial, not least because of Ofcom's extensive enforcement powers, which include fines of up to £18 million or 10% of qualifying worldwide revenue (whichever is greater).
In-scope service providers have some time to prepare for full compliance but should be assessing what they will need to do.
The EU's Digital Services Act
The EU's Digital Services Act (DSA) covers similar but not identical ground to the OSA. It came into force on 16 November 2022 and has applied to all in-scope services since 17 February 2024 (and earlier for certain services designated as very large online platforms or search engines). Its scope is wider than the OSA's: it applies to intermediary services (the generic umbrella category), hosting services, online platforms, online search engines and online marketplaces.
Game businesses may be caught where they fall within the definition of a hosting service or an online platform.
There is a fine line between being categorised as a hosting service and an online platform, especially for game businesses. Functions such as in-game chat, user-generated content, communal online spaces, comment areas and forums can be found in both. The key distinction is the element of dissemination to the public, defined as making information available to a potentially unlimited number of people. Where that is the case, the classification is likely to be online platform rather than hosting service (subject to limited exceptions). It is also worth noting that if a game meets the relevant threshold of monthly users, the calculation of which follows counterintuitive rules (more on this below), it will be categorised as a "very large online platform" (VLOP) and subject to additional requirements.
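To make the tiering concrete, here is a minimal sketch (in Python, purely illustrative and not legal advice) of how the categories described above fit together. The GameService fields and the dsa_category function are assumptions invented for illustration; only the 45 million recipient threshold comes from the DSA itself, and real classification turns on a detailed legal analysis.

```python
from dataclasses import dataclass

# DSA threshold for designation as a very large online platform (VLOP)
VLOP_THRESHOLD = 45_000_000  # average monthly active recipients in the EU

@dataclass
class GameService:
    stores_user_content: bool           # e.g. chat logs, UGC, forum posts stored at users' request
    disseminates_to_public: bool        # available to a potentially unlimited number of people
    avg_monthly_active_recipients: int  # see the counting notes below

def dsa_category(svc: GameService) -> str:
    """Map a game service onto the simplified DSA tiers described above."""
    if not svc.stores_user_content:
        return "likely outside the hosting service / online platform tiers"
    if not svc.disseminates_to_public:
        # e.g. storage of purely private, closed-group communications
        return "hosting service"
    if svc.avg_monthly_active_recipients > VLOP_THRESHOLD:
        return "very large online platform (VLOP)"
    return "online platform"

# An open-server game with public chat and 2 million monthly recipients:
print(dsa_category(GameService(True, True, 2_000_000)))  # -> online platform
```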
The DSA places general obligations on all intermediary services, with additional specific obligations for particular categories, including hosting services and online platforms. While the general obligations primarily concern information and transparency requirements, the specific obligations for hosting services and online platforms mostly relate to protecting users from illegal content, including through content moderation. When moderating content, the emphasis is on transparency, efficiency (e.g. through a trusted flagger system) and appropriate avenues of redress for users.
An online platform is a VLOP if it has more than 45 million average monthly active recipients in the EU, and additional obligations then apply, including around risk assessments and crisis response mechanisms. The European Commission designates VLOPs on the basis of the average monthly active recipient (AMAR) figures which all online platforms must publish. A recipient is anyone who uses the service, and the bar is low: simply visiting a subpage and passively consuming content is sufficient, with no ‘classic’ interaction such as commenting, sharing or posting required. You can find out more about the calculation and publication of user numbers here.
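As a rough illustration of that broad notion of ‘active recipient’, the hedged sketch below counts unique users exposed to a service in a month, where a passive page view counts and no posting or commenting is required. The event log and field names are hypothetical: the DSA does not prescribe a technical counting method, and deduplicating the same person across devices and accounts is one of the counterintuitive difficulties in practice.

```python
from datetime import date

# Hypothetical event log: (user_id, date, event_type)
events = [
    ("u1", date(2024, 3, 2), "viewed_lobby"),    # passive viewing counts
    ("u1", date(2024, 3, 9), "posted_chat"),     # same user, counted once
    ("u2", date(2024, 3, 5), "viewed_subpage"),  # no 'classic' interaction needed
    ("u3", date(2024, 2, 28), "posted_chat"),    # outside the measured month
]

def active_recipients(events, year: int, month: int) -> int:
    """Count unique recipients exposed to the service in a given month."""
    return len({
        user for user, day, _ in events
        if day.year == year and day.month == month
    })

print(active_recipients(events, 2024, 3))  # -> 2 (u1 and u2, deduplicated)
```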
Similar objectives, different requirements
The OSA and DSA have similar aims to make the internet a safer place. Both pieces of legislation take a risk-based approach with the most onerous obligations applying to the largest/highest risk services. However, their scope, while overlapping, is not identical.
The OSA applies to those who provide services enabling user-generated content and imposes obligations only in relation to such content (where it amounts to a criminal offence or is harmful to children). By contrast, the DSA applies to all intermediaries and imposes obligations in relation to all types of illegal content (whether that content amounts to a criminal or a civil wrong), as well as systemic risks arising from other types of content and activity. There are also significant differences in the detail, such as which services must conduct risk assessments, the nature of the safety and other duties, what must be included in terms of service, and the extent of the obligations relating to children and advertising. With both Acts now in force (although most OSA duties do not yet apply, with some exceptions), game businesses will need to assess whether and to what extent they are in scope of each piece of legislation and decide how best to implement the requirements. See a more detailed comparison between the OSA and DSA here. You can find out more about other aspects of the OSA and DSA here.
Read more
Access the fifth edition of our Play Guide for more on key issues impacting the video game sector.