The rise of generative AI in film productions
In recent years, the use of generative artificial intelligence (AI) has shifted from theoretical potential to practical application within the film industry. Productions such as The Irishman (2019), which used extensive digital de-aging techniques on actors Robert De Niro and Al Pacino, and Rogue One: A Star Wars Story (2016), which digitally recreated the late Peter Cushing as Grand Moff Tarkin more than twenty years after the actor's death, have illustrated the power of AI-assisted image manipulation. In 2022, James Earl Jones consented to AI-based voice synthesis of his iconic Darth Vader voice for the 'Obi-Wan Kenobi' series. On 16 May 2025, the online video game Fortnite Battle Royale added a Darth Vader non-playable character that players can converse with via generative AI modelled on Jones' voice; Fortnite developer Epic Games received permission from Jones' estate to use his voice in the game.
Generative AI differs fundamentally from prior digital effects (CGI, VFX) by creating new content based on training data, often using vast datasets that include prior performances, images, and recordings of actors. This technological leap offers producers cost-saving potential, from reducing shooting days to enabling actors to 'appear' in scenes without physical presence. However, it also raises profound legal and ethical questions, particularly concerning the personal rights of actors whose likenesses, voices, and performances may be digitally replicated, altered, or reused.
Cost efficiency vs personality rights – how to balance opposing interests
From the producer's perspective, generative AI represents an opportunity to streamline production, reduce physical risks for actors (particularly in stunt work), and enable flexible scheduling. For international co-productions and large-scale streaming projects in particular, AI-based solutions can help lower production costs while preserving high-quality output.
Conversely, actors face significant threats to their personality rights. Their 'performance identity' - including image, voice, mannerisms, and emotional expression - constitutes both an essential element of their livelihood and a legally protected personal right. Under German law, this is primarily anchored in the right to one's own image pursuant to sections 22 and 23 of the German Art Copyright Act (Kunsturhebergesetz). The key areas of legal tension include:
- Unauthorised digital replication: creation of digital replicas ('digital doubles') without actor consent.
- Extended use of likeness: reuse of digital models in sequels, remakes, or other projects not originally agreed upon.
- Manipulative transformations: substantive changes to the actor's appearance, age, gender, ethnicity, or voice.
- Compensation models: fair remuneration for the use of AI-generated performances, especially when the actor is not physically present on set.
Actors' unions fear an erosion of their bargaining power if studios can reuse or manipulate AI-generated versions of their work indefinitely without proper consent or remuneration. This could fundamentally alter labour relations within the entertainment industry.
The German compromise: the AI Annex for film and TV productions
In October 2024, German producers' associations and actors' unions signed a detailed annex to the collective bargaining agreement for film and television professionals employed for the duration of production, establishing a comprehensive framework for the use of generative AI in film productions (AI Annex). Key elements include:
- Clear definitions: the AI Annex distinguishes between traditional digital effects (CGI, VFX) and generative AI, which creates new content based on probability algorithms and training data.
- Consent as cornerstone: any creation, duplication, distribution, or public display of a digital replica requires the express consent of the actor, independent of how the replica was created (including whether data came from past footage or motion capture scans). Such consent must be specific to the intended use, documented in writing, and accompanied by separate financial compensation.
- Reuse restrictions: using digital replicas for projects beyond the original production (eg sequels, prequels, remakes) requires fresh consent and separate remuneration unless the actor is re-engaged for the new production.
- Permissible modifications: routine post-production adjustments (eg colour grading, lip-sync for dubbing, continuity corrections) are allowed without specific consent. However, substantial changes to an actor's appearance or performance, particularly when deviating from the original script, require explicit permission.
- Risk management carve-outs: generative AI may be used to replicate actors in dangerous scenes (eg stunts); in such cases, the actor may not unreasonably withhold consent.
- Compensation models: when actors permit the use of their digital likeness in lieu of physical performance, remuneration is calculated on the basis of 'fictive shooting days' estimated from the length and complexity of the relevant scenes.
- Artificial composite actors: the AI Annex also addresses fully synthetic performers generated from combined features of multiple real actors, recognising legal exposure if such creations remain identifiable to any contributing actor.
- Ongoing evaluation: recognising rapid technological changes, the parties committed to semi-annual reviews to adapt the agreement.
The international landscape and the SAG-AFTRA agreement in the US
The AI Annex is one example of a growing number of labour agreements addressing the use of generative AI in film productions; discussions are ongoing in several other countries, including Canada, France and the UK. Following the high-profile 2023 Hollywood strikes, SAG-AFTRA negotiated protections requiring actors' informed consent for AI replicas, including digital doubles and voice models, along with disclosure obligations and remuneration agreements. The agreement allows limited reuse for franchise projects under specific conditions but has been criticised for lacking perpetual consent mechanisms and stronger revenue-sharing clauses.
To the extent information on the SAG-AFTRA agreement is publicly available, a comparison reveals notable commonalities with the AI Annex. Both frameworks require actors' informed consent for the use of digital replicas and mandate compensation. They also recognise the need to distinguish minor post-production adjustments from generative AI applications that simulate new performances. In addition, both include specific exceptions for stunt and risk scenes and attempt to address emerging concerns such as synthetic actors, albeit in varying degrees of detail.
Despite these shared principles, several key differences remain. The German solution appears to adopt a more formalised structure, including a commitment to semi-annual reviews, and introduces the concept of 'fictive shooting days' as a basis for compensation - a detail apparently not included in the SAG-AFTRA framework. Conversely, the US model provides more specific procedural requirements for disclosure by producers and broader reuse permissions where agreed in the initial contract. The US agreement also includes clauses on legacy contracts and deceased actors, areas not explicitly addressed in the German agreement.
Wrap-up
The 2024 German AI Annex represents a sophisticated attempt to balance the economic interests of producers with the dignity and professional integrity of actors in the era of generative AI. Its model of consent-based licensing, clear remuneration rules, and flexible carve-outs for risk scenarios offers a legally sound and pragmatic framework for a complex issue. For the global entertainment industry, it may well serve as a reference point for responsible AI integration in the creative industries.