Author

Erik Steiner

Senior Associate

9 May 2023

Machine Rising: Keep your legal mind sharp so your video games stay on the market

  • Briefing

Issue #1

The video games industry has been using artificial intelligence (AI) for decades. As far back as the vacuum-tube computer age of the 1950s, the first machines capable of playing chess or reduced chess-like games entered the market. Only about 40 years later, in 1997, chess engines running on supercomputers or specialised hardware were able to defeat even the best human players. So although the use of AI is not new to the video games industry, the recent rise of new technologies like ChatGPT or Stable Diffusion may change the way games are designed, developed, and experienced.

The low barrier to entry makes generative AI models like ChatGPT feel all the more exciting and easy to demonstrate: with simple prompts, images and even computer code can be drafted within seconds. But as we all know, "with great power comes great responsibility", or at least a lot of associated legal issues.

What is generative AI?

Generative AI works by learning patterns and structures from huge volumes of data. This data typically consists of text, images, or other forms of media such as music or computer code. The training data for an AI model can be obtained from various sources, such as publicly available datasets like Common Crawl, web scraping, or the mining of open-source repositories. Once the AI tool is trained to recognise certain patterns, it can generate output based on a text input (a prompt). In the video games industry, there are many potential use cases, ranging from art, music, and computer code to level design, pitches, and marketing materials.

Generative AI also makes it possible to enhance the in-game player experience by creating life-like situational developments that keep gamers hooked on the game. The following is a non-exhaustive list of use cases:

  1. Non-player characters (NPCs): Game AI was already being used for NPCs in the early days of video game development. NPCs are characters in the game who act intelligently as if they were controlled by human players. Their behaviour is typically determined by algorithms (such as a "simple" state machine). With generative AI, characters can interact with players in a more realistic and dynamic way, adding immersion or challenge to the game.
  2. Content generation: AI can assess a player's ability and emotional state and tailor the game accordingly. This could even involve dynamic difficulty balancing, in which the game is adjusted in real time, or the generation of new content such as enemies, items, and levels.
  3. Accessibility: By creating virtual assistants or automatically generating elements personalised to individual players' needs, video games can become more inclusive for players from different linguistic backgrounds or with disabilities.
  4. Fraud detection: AI can be used to detect cheating or hacking, helping to maintain a fair and enjoyable experience for the player base.
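To illustrate the "simple" state machine mentioned in point 1: the following is a minimal, hypothetical sketch of a guard NPC whose behaviour switches between patrolling, chasing, and attacking based on the player's distance. The class name, states, and distance thresholds are illustrative assumptions, not drawn from any particular game or engine.

```python
from enum import Enum, auto

class State(Enum):
    PATROL = auto()
    CHASE = auto()
    ATTACK = auto()

class GuardNPC:
    """A guard NPC driven by a simple finite state machine (illustrative)."""

    ATTACK_RANGE = 2.0   # assumed distance thresholds in game units
    SIGHT_RANGE = 10.0

    def __init__(self):
        self.state = State.PATROL

    def update(self, distance_to_player: float) -> State:
        # Transition rules: the current state is chosen purely from
        # the player's distance, re-evaluated once per game tick.
        if distance_to_player <= self.ATTACK_RANGE:
            self.state = State.ATTACK
        elif distance_to_player <= self.SIGHT_RANGE:
            self.state = State.CHASE
        else:
            self.state = State.PATROL
        return self.state

guard = GuardNPC()
print(guard.update(15.0))  # State.PATROL: player out of sight
print(guard.update(6.0))   # State.CHASE: player spotted
print(guard.update(1.0))   # State.ATTACK: player in range
```

Each state transition here is fully deterministic; the contrast with generative AI is that a model-driven NPC could instead produce varied dialogue or behaviour that is not enumerated in advance.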

Numerous legal issues are associated with this rise in use cases. These issues are further complicated by the fact that video games are multi-jurisdictional products, and each jurisdiction has its own laws and courts. The following are a few examples, mostly from an Austrian and EU perspective, which will be examined in more detail in the coming issues of this article series.

Legal issues with generative AI

  • Can you use training data for your AI model?
    Data sets used for training an AI model often contain copyrighted material such as game art, characters, computer code, and music. The risk: such use could constitute copyright infringement.
  • What if the AI uses open-source material?
    Using training data pulled from open-source software or data repositories can also lead to legal problems. The challenge with open-source content is that, even though the materials can be used royalty-free, they are typically subject to licence terms. In certain cases, any derivative must retain the copyright notice, attribute the author, or identify modifications. Currently, hardly any AI tool complies with these requirements.
  • Is the output copyrightable?
    Under the Austrian Copyright Act, a copyrightable work must be a peculiar intellectual creation of a human being. Is writing a prompt already sufficient to create a copyrightable work? If a developer uses an AI tool but also exercises substantial human involvement in the creative process, the result will likely be protectable by copyright. However, it is not clear where this line will be drawn.
  • Who is liable for copyright infringement by AI-created works?
    Since the AI model is trained on copyrighted works, a work created by the AI could be so similar to the training materials that it constitutes a derivative work or even a copy. Providers of AI tools have drafted their terms of service in an attempt to shift liability regarding the output to users.
  • Is there a problem with data protection?
    AI tools can also raise privacy concerns. Depending on the content used to train the model, privacy issues may arise, e.g. with an AI tool analysing the movements of players.
  • Is it possible to infringe competition law through AI tools?
    As AI-driven games become more prevalent, issues such as misleading or aggressive business practices, advertising to children in games, and the unauthorised use of third-party intellectual property have to be considered when using or developing AI.

The potential legal consequences, in addition to the typical fines for violations of data protection laws, are far-reaching. Affected parties can claim cease-and-desist, removal of the video game from the stores, and compensation for damages. Furthermore, it is standard practice to claim publication of the court decision on the website, in the game store and/or in the video game itself. Last but not least, in the case of unfair competition claims, the claimant may also be a competitor.

Conclusion

To stay ahead of the curve, AI is essential, but its use can raise significant legal issues. Given that the EU and other countries are currently working on AI regulation, the potential legal issues are only likely to increase. It is therefore necessary to assess AI tools and mitigate the associated risks early in the integration process in order to minimise liability.

Watch out for more articles on this topic to follow!
