The rapid evolution of generative artificial intelligence (AI) since the release of ChatGPT in November 2022 is nothing short of astonishing. Large Language Models (LLMs) and other generative AI models continually surpass one another in scope, capability, and realism. Not only text generators but also image, audio, and video generators now produce output that is almost indistinguishable from human-created content to the untrained eye and ear. This raises a multitude of legal questions for video game development, particularly concerning deep fakes and personal rights.

Legal Framework and Risks

In Germany, various laws protect the personal rights of natural persons. These primarily include the general right of personality (Art. 2 para. 1 of the Basic Law for the Federal Republic of Germany (Grundgesetz, GG) in conjunction with Art. 1 para. 1 GG), the right to a name (Sec. 12 of the German Civil Code (Bürgerliches Gesetzbuch, BGB)), the right to one’s own image (Secs. 22 et seq. of the German Act on the Protection of Copyright in Works of Art and Photography (Kunsturhebergesetz, KUG)), as well as the right to the written and spoken word. Even without the use of AI, developers and publishers should always be aware of potential infringements of third-party personal rights.

The use of AI models to generate NPCs (non-player characters) in video games, however, inherently carries the risk that infringements of personal rights occur more quickly and, without proper control of the AI's output, go unnoticed. Voice generators that can imitate a specific voice from only a small amount of training data, combined with 3D scans of facial features and corresponding gesture analysis, make it possible to create highly detailed virtual imitations of a person. A generative AI model trained on speeches and tweets of a celebrity, e.g., Scarlett Johansson (think of OpenAI's presentation), could violate that person's personal rights if a corresponding NPC is used in a video game without the required consent.

Overview of Relevant Personal Rights

General Right of Personality

The general right of personality fundamentally protects the individual's freedom and dignity. A violation may occur if an in-game character is recognizable as an imitation of a real person and portrays that person in a degrading or intrusive manner, even if the name and likeness are not explicitly used.

Since video games can also qualify as works of art within the meaning of Art. 5 para. 3 GG, the freedom of art may conflict with the affected person's general right of personality. A balancing of the opposing rights and interests must therefore always take place.

Right to One’s Own Name

The right to a name protects a person’s name from unauthorized use (Sec. 12 BGB). This applies to both the civil name and stage names or pseudonyms.

It is important to note that the mere mention of a name does not necessarily infringe the right to a name. Rather, a significant portion of the public must be led to believe that the bearer of the name has consented to its use. This is likely to be the case in particular where NPCs bear the names of well-known persons and play a prominent role in the video game.

Right to One’s Own Image

The right to one's own image follows from Secs. 22 et seq. KUG, which prohibit the dissemination or public display of a person's image without their consent. The exceptions most relevant for video games are found in Sec. 23 para. 1 nos. 1 and 4 KUG: images may be disseminated or publicly displayed without consent if they belong to the realm of contemporary history (no. 1) or were not commissioned, provided that their dissemination or display serves a higher artistic interest (no. 4). However, this permission does not extend to dissemination or display that violates a legitimate interest of the depicted person or, if that person is deceased, of their relatives (Sec. 23 para. 2 KUG).

Thus, as is often the case in law, a balancing of the developers’ and publishers’ interests on one side and the interests of the depicted person on the other is required.

“Deep fakes” pose a particular challenge to the right to one's own image. The use of AI to create hyper-realistic images, videos, and 3D models as NPCs can give the impression that the depicted person is genuinely involved in the project. With regard to the exception for artistic interest (Sec. 23 para. 1 no. 4 KUG), the question arises whether this legal basis applies to deep fakes at all. Moreover, with deep fakes, the depicted person's legitimate interest in protecting their own image is likely to carry particularly significant weight.

Right to the Spoken Word

As part of the general right of personality, a person is also protected against the unauthorized recording and dissemination of their spoken words. This applies in particular to private and confidential conversations where the participants have not consented to the recording or dissemination.

In the context of game development, the right to the spoken word could be affected, for example, if the voice of a real person is used in a game without that person's consent. This is especially true when voice generators are used that can imitate a person's voice based on minimal data. Here too, developers and publishers must obtain the explicit consent of the individuals concerned before their voice is used in the game in order to avoid legal consequences.

Deep Fakes in the AI Regulation

Whereas deep fakes were previously addressed only through the protection of personal rights, the AI Regulation (also known as the “AI Act”) now applies as well. The AI Act is an EU regulation that establishes harmonized rules for the development, marketing, and use of artificial intelligence. Its aim is to protect the fundamental rights of natural persons and to ensure that AI systems and AI-generated content are used transparently and safely.

The AI Act explicitly regulates deep fakes. Art. 50 of the AI Act obliges providers and operators of AI systems that generate or manipulate deep fakes to clearly disclose that such content is artificially generated. Here is a brief overview of the transparency obligations:

  • For providers (Art. 50 para. 2 AI Act): Those who develop or place on the market an AI system that generates synthetic audio, image, video, or text content must ensure that the outputs are marked in a machine-readable format and detectable as artificially generated (see the sketch after this list for what such a marking could look like). This obligation may not apply to video games if the AI system performs a merely assistive function for standard editing or does not substantially alter the operators' input data.
  • For operators (Art. 50 para. 4 AI Act): Operators of AI systems that generate or manipulate deep fakes are likewise required to disclose that the content was artificially generated or manipulated. For most video games, the relief provided in Art. 50 para. 4 sentence 3 AI Act will be relevant: where the content is part of an obviously artistic, creative, satirical, fictional, or analogous work or program, the transparency obligation is limited to disclosing the presence of the deep fake in an appropriate manner that does not impair the presentation or enjoyment of the work.
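
To illustrate what such a machine-readable marking could look like in practice, here is a minimal Python sketch that embeds an "artificially generated" label into the metadata of a PNG asset using the Pillow library. The tag names and values are our own illustrative assumptions; the AI Act does not prescribe a specific technical format, and emerging industry standards such as C2PA content credentials may be preferable in production.

    # Minimal sketch: embed a machine-readable "AI-generated" marker into
    # a PNG asset's metadata. Tag names and values are illustrative
    # assumptions, not prescribed by Art. 50 para. 2 AI Act.
    from PIL import Image
    from PIL.PngImagePlugin import PngInfo

    def save_with_ai_label(image: Image.Image, path: str) -> None:
        """Save a generated asset together with an 'artificially generated' marker."""
        meta = PngInfo()
        meta.add_text("ai_generated", "true")           # hypothetical tag
        meta.add_text("generator", "example-model-v1")  # hypothetical value
        image.save(path, pnginfo=meta)

    # Reading the marker back, e.g., in an asset-pipeline check:
    # Image.open(path).text.get("ai_generated")  # -> "true"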

Deep fakes are not automatically classified as “high-risk” AI; they do not form a standalone category of high-risk AI systems. According to Art. 6 AI Act, the risk is assessed based on the potential dangers to the safety, health, or fundamental rights of individuals. Therefore, when using an AI system that enables the creation of deep fakes, a proper assessment must always be conducted in accordance with the general principles of Art. 6 AI Act (you can read more about this in our article “Risk Classification under the AI Act”). Should the assessment classify the system as high-risk AI, further obligations will arise for developers and publishers.

Practical Takeaways

Developers and publishers of video games should take the following measures to avoid legal and ethical conflicts:

  • Obtain consent: Developers and publishers should obtain the explicit consent of the individuals concerned before depicting them in their games. This applies to all aspects of personal rights.
  • Responsible use of deep fakes: The use of deep fakes carries particular risks, both legally and ethically. Careful consideration should always be given to whether their use is justified.
  • In-house compliance: Developers and publishers should establish clear guidelines for the use of these technologies in their game development and ensure that they are followed (a minimal sketch of such a compliance record follows this list).
  • Ensure transparency: Developers and publishers should inform players about the use of AI and deep fakes in their games and demonstrate responsibility in handling these technologies.
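
As one way of operationalizing these takeaways, the following Python sketch shows a hypothetical internal compliance record for AI-generated NPC assets. All field names and checks are our own illustrative assumptions; neither German law nor the AI Act prescribes such a structure, and real guidelines will need legal review.

    # Hypothetical internal compliance record for AI-generated NPC assets.
    # All field names and checks are illustrative assumptions only.
    from dataclasses import dataclass

    @dataclass
    class NpcAssetRecord:
        asset_id: str
        resembles_real_person: bool   # likeness, name, or voice of a real person?
        consent_obtained: bool        # documented consent (e.g., Secs. 22 et seq. KUG)
        is_deep_fake: bool            # deep fake within the meaning of the AI Act?
        disclosed_to_players: bool    # disclosure per Art. 50 para. 4 AI Act

        def release_blockers(self) -> list[str]:
            """Return open compliance issues to resolve before release."""
            issues = []
            if self.resembles_real_person and not self.consent_obtained:
                issues.append("missing consent of the depicted person")
            if self.is_deep_fake and not self.disclosed_to_players:
                issues.append("missing deep-fake disclosure to players")
            return issues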

Conclusion

The use of AI and deep fakes in video games brings not only technical innovations but also new legal and ethical challenges. Developers and publishers should be aware of these risks and handle these technologies responsibly in order to respect personal rights and avoid legal disputes. Even when AI is used to generate NPCs, it therefore remains essential to exercise care and to review the generated output for potential infringements.

Are you planning to integrate AI and deep fakes into your next game project? We will gladly support you in efficiently incorporating the relevant legal requirements into your development processes. Let’s work together to find practical solutions that combine innovation and legal certainty.