25 July 2022

Building games safely online: The Online Safety Bill

The explosive growth of the digital world has been largely unregulated so far, on the basis that too much regulation would stifle innovation. But high-profile examples of sports stars inundated with racist abuse, teens encouraged to commit suicide and the prevalence of child sexual exploitation and abuse have turned the tide of political appetite towards regulation of online spaces. Incoming legislation principally targets the platforms which facilitate offending content and practices and which, legislators argue, have profited from the lack of regulation to date.

In a gaming world which is engineering a 'metaverse' to recreate and augment real life interactions online, risks of harm are self-evident. Children may be exposed to adults without the usual buffer of parents, guardians or teachers, and many users have reported virtual groping and verbal abuse in VR settings.

The Online Safety Bill (OSB) is the UK's headline content legislation in this space. The OSB is intended to protect users, particularly children, from online harm. It will focus mainly on user-generated content, covering both illegal and harmful material, and it introduces a statutory duty of care on certain online providers to protect their users from harm. It will apply to:

  • user-to-user services (internet services that enable content generated, uploaded or shared by one user of the service to be encountered by another user)
  • search services (a service which allows users to search more than one website or database).

Although it is not yet finalised in the form of an Act (what will presumably be the Online Safety Act), it has undergone substantial legislative scrutiny and was nearing completion before being delayed following Boris Johnson's resignation.

While the largest technology and social media companies are likely to be deemed the highest risk due to high volumes of user-to-user interaction and sharing of user-generated content, online gaming no doubt falls within scope for the same reasons.

Which types of games are caught?

Broadly, online games facilitating player-to-player interaction or players' creation of content are within scope of the OSB. The geographic scope requires that one of the following applies:

  • there are a significant number of UK players
  • the UK is a target market, or
  • the game can be played in the UK and there are reasonable grounds to believe that there is a material risk of significant harm to UK individuals.

In particular, the following sorts of functionality are likely to fall within scope of the OSB:

  • Text or voice chat functionality (team chat in team-based games such as Counter Strike, or chat in large open servers which bring player avatars together such as Fortnite).
  • Games built around the generation and sharing of user generated content (like Minecraft).
  • Virtual reality/metaverse functionality in communal online spaces.
  • Games with built-in livestreaming functionality, forums, marketplaces or other platforms which facilitate user-to-user interactions.

Games without online functionality, or games whose online functionality doesn't fit these criteria, are unlikely to be within scope. In the gaming context, there is also an exception for one-to-one live spoken communications. This may apply to spoken chat functionality between two players in a specific chatroom, server or lobby, but as soon as more than two people join, or if there is also text chat functionality, the exception will not be relevant.

Of course, organisations within the gaming space but which are not strictly developers or publishers may also be impacted. Games forums, marketplaces, distribution platforms, livestreaming and video-on-demand platforms are all likely to be caught. For more detail on the scope of the OSB, please read here.

What will games businesses need to do?

The OSB orients its regulation around three types of content: illegal content, content that is harmful to children, and content that is harmful to adults. For more detail on what these mean, please read here.

In summary, organisations will have safety duties dependent on the specific type of content they are trying to manage. These include:

  • carrying out risk assessments. It is critical at the outset that businesses understand the sorts of content, players and harms which might be facilitated by their game, as this will inform what other compliance steps are taken
  • using proportionate measures to mitigate risk of harm deriving from illegal or harmful content. What this means will entirely depend on the specific game or environment and the relevant harms. Ofcom (the OSB's enforcer) will publish codes and guidance to support businesses, which may include recommendations to use particular tools for content moderation, user profiling and behaviour identification
  • using proportionate processes to prevent individuals from encountering different elements of the illegal or harmful content. Again, this will depend on the specific game. Protections might include age verification measures, automatic blocking of certain words, acceptable use policies or codes of conduct which are strictly enforced, restricting certain content or functionality to certain player groups, etc
  • using proportionate processes to take down illegal content when aware of it. Games businesses will need to have capacity to be nimble in taking down offending content. Most online games already facilitate inter-player reporting (such as for cheating or offensive language) – this functionality may require expansion and stronger moderation
  • specifying in terms and conditions how individuals are protected from illegal and harmful content, and carrying through on these consistently. Players will need to be given more information about the measures a business deploys to protect its player-base; this might take the form of FAQs or policies, which will need to be comprehensible for all relevant ages.
Won't somebody think of the children?!

All games organisations will need to carry out children's risk assessments to establish whether their player-base includes children. This assessment should determine:

  • whether it is possible for children to access the game. Practically, this is likely to be the case unless age verification or assurance is used
  • if such access is possible, whether there are a significant number of children who are players, or the game is likely to attract a significant number of child players.

It seems likely that all but the most mature indie games which are also behind age walls will meet this threshold. In addition to the summary requirements for in-scope games businesses, there are requirements which relate expressly to the protection of children. These include an overarching duty to protect children's online safety, breaking down into sub-duties to:

  • manage risks of harm according to different age groups of children. For example, depictions of certain violence may be unlikely to harm 16 and 17 year olds, but would be more likely to harm those below 10
  • prevent children from encountering "primary priority content", some of which will be determined by secondary legislation. This means that there will be types of content which are automatically deemed harmful to children
  • provide for children's reading ability when drafting policies and information around protections deployed by the organisation.

For more detail on child protection under the OSB, please read here.

What about the 'what abouts'?

The OSB leaves open a number of questions and uncertainties. Some of these may be resolved in the statutory codes of conduct which Ofcom is obliged to produce, but others may come down to operational implementation.

Which party in the game lifecycle is responsible?

Should it be the developer, the party most responsible for the production of the game? Should it be the publisher, the party funding and commercialising the game? Should it be the distributor (if different from the publisher), wanting to protect the brand of its platform? The OSB itself isn't specifically targeted at gaming and therefore does not address such questions directly. We anticipate that once the OSB is in effect, the contractual arrangements between these parties will determine who is responsible for what, including operational requirements and liability if things go wrong.

What will SMEs and indie developers need to do?

A thorough OSB compliance project, entailing myriad risk assessments and operational changes, is likely to require significant resource to manage. While this is achievable for large organisations which have more cash to spare on compliance processes, there is unfortunately no carve-out in the OSB for organisations under particular thresholds. This means that the likes of indie developers will still be on the hook.

Practically speaking, Ofcom itself will be managing resources and enforcement priorities. It isn't yet clear whether the games industry is specifically in the crosshairs or whether it will amount to collateral damage, but the smaller the organisation, the less likely it is to attract regulator attention unless it does something particularly egregious. Ofcom has said its regulatory responsibilities will involve it partly "learning on the job" and that it will therefore engage with the organisations it regulates in the first instance, rather than moving straight to enforcement action. This may allow smaller organisations to shape up rather than face immediate sanctions.

Frequent references to "proportionality" should also help to assuage some fears. What's proportionate for the world's largest publisher will not be the same for a new indie mobile game developer.

In-game personalisation

There will undoubtedly be edge cases of content which is neither clearly in nor out of scope. For instance, are personalised characters or avatars captured? And what about personalised emblems, logos and other in-game items used, for example, to indicate association with particular individuals or groups (clan tags)?

Hopefully, any Ofcom guidance or codes of conduct targeted at games will assist in this regard, but it may be that edge cases require testing in the courts.

What should you do now?

The UK Children's Code has already drawn a line in the sand to try to protect children and their personal data. Both it and the OSB form part of a wider UK and EU regulatory endeavour to protect people online. It is therefore very important that all gaming businesses, big and small, begin to plan for the implementation of the OSB and for greater regulation of problematic online content.
