In video games, artificial intelligence (AI) has long been a familiar term: NPCs (non-player characters) and bots have always been described as AI. However, genuinely capable generative AI is now raising the bar, as illustrated by NVIDIA Omniverse ACE (Avatar Cloud Engine), a proprietary set of tools for AI game development that graphics card manufacturer NVIDIA announced at Computex 2023.
Technological advances in the field of generative AI have opened up unprecedented possibilities for creating NPCs and bots and filling them with “life”. This enables an even deeper level of immersion in an already highly immersive medium.
The implementation of communication capabilities driven by generative AI may lead to situations where it is no longer possible for gamers to easily distinguish real people from AI. This can have both positive and negative effects.
Positive effects include, for example, faster matchmaking in player-versus-player (PvP) modes or an even livelier world in an MMO (Massively Multiplayer Online game). This can significantly extend the life cycle of video games, as players regularly avoid online games that have only a small active community (often described as “dead”) and switch to livelier games instead.
Negative effects, on the other hand, can occur if developers use generative AI in this way without informing players about it. If a game is not considered dead, it is more likely to attract players: those who do not know that the game contains many bots assume that many people are playing it and are more willing to invest their valuable time and money. This can put comparable games that do not use AI, or that openly communicate its use, at a disadvantage, because players of online games are primarily interested in the interpersonal aspect, be it socialising and entertainment or testing their own skills against other people.
Under certain circumstances, this use of generative AI could be regarded as an act relevant under competition law. There is a risk that such integration of AI bots could be considered unfair within the meaning of Sections 3 (1), 5a (1) of the German Act against Unfair Competition (UWG) if the manufacturer does not inform the players accordingly and the players cannot easily distinguish between a real person and an AI bot. This can then give rise to claims for injunctive relief (Section 8 UWG) and damages (Section 9 UWG).
Furthermore, there is the risk that the AI, in an attempt to emulate authentic human communication in the game, makes offensive or discriminatory statements directed at human players. If this amounts to a violation of personality rights, the game manufacturer would be liable. In addition, AI behaviour that is not suitable for minors can lead to an unintended higher age rating from the German games rating body (USK).
In practice, the following should be noted:
- If generative AI is used in online video games to give players the impression of a more active game, developers should label it accordingly. Otherwise, there is a risk of competitors issuing legal warnings under competition law.
- If the AI has chat functionality, developers should ensure that players cannot coerce it into being abusive, insulting or discriminatory. Otherwise, the game studio might be liable for any resulting violation of personality rights.
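As a minimal sketch of such a safeguard, an output-side moderation gate can check every AI-generated chat message before it reaches players, so that even a successfully manipulated (“jailbroken”) model cannot deliver abusive text. All names and patterns below are hypothetical; a real deployment would use a trained moderation classifier or a dedicated moderation service rather than a hand-written blocklist:

```python
import re

# Hypothetical blocklist for illustration only; a production system
# would rely on a trained moderation model, not a static word list.
BLOCKED_PATTERNS = [
    r"\bidiot\b",
    r"\bloser\b",
]

FALLBACK_MESSAGE = "[message withheld by moderation]"

def moderate_ai_chat(message: str) -> str:
    """Return the AI chat message if it passes moderation, else a fallback.

    Running the check on the model's *output* (rather than only on the
    player's input) means abusive text is blocked regardless of how it
    was provoked.
    """
    lowered = message.lower()
    for pattern in BLOCKED_PATTERNS:
        if re.search(pattern, lowered):
            return FALLBACK_MESSAGE
    return message
```

For example, `moderate_ai_chat("Good game!")` passes the message through unchanged, while a message matching a blocked pattern is replaced by the fallback text instead of being shown to the player.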