Deepfake technology is becoming more sophisticated and attracting interest from the gaming and entertainment industries. However, the use of deepfake technology is largely unregulated. What are the legal risks associated with its use?
There has recently been viral speculation in the media that acclaimed actor Bruce Willis has sold the rights to his likeness to US AI (artificial intelligence) firm Deepcake.
Willis partnered with Deepcake in 2021 to create a digital twin of himself for use by a Russian telecoms company. The digital twin was created using deepfake technology trained on 34,000 images of Willis drawn from photos and footage from the films Die Hard and The Fifth Element. You can see from this Facebook video that the resemblance is uncanny.
Deepfake technology is becoming more sophisticated and increasingly attractive to both the gaming and entertainment industries. It allows developers to sample a real actor's facial features and expressions and generate real-time facial movements, even driven by chat-generated text. This means the cost of producing a virtual character for a game, say your favorite sports star, could be halved.
So what is deepfake technology and what are the legal risks associated with its use?
What is a deepfake?
A deepfake is an imitation of a video, audio recording or photo that appears real but is the result of manipulation by artificial intelligence. Deepfakes are an example of "synthetic media": images, sounds and video that appear to have been created through traditional means but are actually constructed by complex software, generally advanced forms of machine learning and AI.
You may have come across the @deeptomcruise TikTok account, which has posted dozens of deepfake videos impersonating Tom Cruise and has attracted around 3.6 million followers. The videos look virtually identical to the real deal.
But this technology is not limited to TikTok. In 2021, visual artist Chris Umé released a video in which he applied deepfake technology to the faces of FIFA players. Gaming companies are now exploring future deepfake opportunities within the gaming industry, because who wouldn't want to play a game with a lifelike avatar of their favorite character, say from Star Wars? An even more immersive gaming experience would consist not only of controlling the character, but also of having the avatar follow your face and mouth movements – something that deepfake technology makes a reality.
The legal risks of deepfake technology
Deepfakes are evolving rapidly in a largely unregulated area of technology. There are significant concerns about their misuse, such as fraudulent impersonation for financial gain. For example, Patrick Hillman, Chief Communications Officer of blockchain ecosystem Binance, claimed that scammers created a deepfake of him and used it in Zoom meetings to trick contacts into believing they were dealing with him.
Because of these concerns, Google has banned the training of AI systems that can be used to generate deepfakes on its Colaboratory platform. It has also released a database of 3,000 deepfakes to help researchers develop the tools they need to detect malicious deepfake videos.
Copyright problems
The problems related to copyright are likely to be complex.
In Bruce Willis' example, if a developer is considering using images from Willis' films to create the deepfake technology, they may need to get permission not only from the actor, but also from the production company. In general, the copyright in a finished film belongs to the person who arranged for it to be made (usually the producer or production company). Unless otherwise agreed, permission (ie a license) for use will likely need to be obtained from the production company. Developers creating deepfake technology must undertake due diligence to ensure they do not infringe the copyright of any person involved in the creation of the film and/or images used.
Australian Consumer Law Compliance
A developer must also consider their obligations under the Australian Consumer Law.
Section 18 of the Australian Consumer Law provides that a person must not, in trade or commerce, engage in conduct that is misleading or deceptive or is likely to mislead or deceive. A developer must ensure that all necessary permissions have been obtained from the person whose image is being used for the deepfake technology. If the permissions are not obtained and the technology is developed, the developer could face significant liability for misrepresenting that the individual endorsed the use of their images or is otherwise affiliated with the technology.
As deepfakes continue to evolve, game developers exploring this technology will need to rethink their existing contractual arrangements and grapple with any new laws that may be introduced. It will be important to ensure that the necessary intellectual property and license agreements are in place to adequately document the rights to an individual's digital image.
We believe that these agreements could be quite complex and will require considerable expertise to ensure game developers’ rights are adequately protected.