A Deepfake App Could Be in Deep Trouble with California Celebrities
A deepfake is an image or video of a person, often a celebrity, that has been digitally altered using an artificial intelligence (“AI”) application to appear to be someone else. Deepfake technology has the potential to be either a boon or a bane to celebrities. As a boon, celebrities can use deepfake technology to extend their professional opportunities. For example, Bruce Willis licensed his likeness to the AI-based content creation company Deepcake, which created a “digital twin” of Willis that appeared in a Russian telecom commercial. Even though Willis might have health issues that limit some activities, his digital twin effectively allows him to continue his professional career indefinitely, regardless of his health or age, and without the need for his physical presence.
Deepfake technology can also be a bane to a celebrity when used to exploit their likeness without compensation or authorization, which is the alleged situation in Kyland Young v. NeoCortext, Inc., No. 2:23-cv-02496-DSF-PVC (C.D. Cal.). Defendant NeoCortext’s deepfake app, Reface, allows a user to swap their face with someone else’s, including a celebrity’s, but the celebrity has not necessarily authorized NeoCortext or the user to display or manipulate their likeness. Reface uses the likeness of former Big Brother contestant Kyland Young, among others, without authorization. Consequently, Young filed a complaint against NeoCortext on April 3, 2023, asserting a single cause of action: violation of California’s Right of Publicity Statute (Cal. Civ. Code § 3344). Young filed the case as a putative class action on behalf of an indefinite number of similarly situated California citizens.
California’s Right of Publicity Statute protects individuals from having their likenesses used by others, knowingly and without permission, on products or to advertise or sell goods and services. Young claims that NeoCortext has violated this statute by using his likeness, and the likenesses of other celebrities, to encourage users to pay to upgrade their version of the app.
Young alleges that when a user first opens the app, it shows them a background video of a user swapping his face with those of several celebrities and well-known fictional characters. Next, the app gives the user access to a Pre-set catalogue, which the user can search until they find an individual they want to become. The user then uploads their own image or video to the app. Reface scans the user’s upload and generates a new image or video, swapping the face of the individual in the Pre-set catalogue with the face from the user’s uploaded image or video. An image generated by the free version of the app contains a prominently displayed, irremovable watermark stating “made with reface app.”
The free version of the app also has a button labeled “Watermark” that shows a picture of a water drop crossed out. Clicking that button prompts the user to upgrade to the “Pro Version” of Reface for a monthly subscription fee or a single lifetime payment, which allows the user to generate deepfakes without watermarks. Young alleges that NeoCortext commercially exploits his likeness and the likenesses of other class members to promote paid subscriptions in two ways: (1) allowing users to pay to remove the watermarks that detract from the images; and (2) using the watermarks as free advertising to attract new users to Reface. This case is in its very early stages and remains pending.
This case is a reminder that creators and users of deepfake apps and other generative AI technology should be mindful of third-party rights, especially when they exploit those rights for their own commercial benefit. An image or video is not in the public domain merely because it is available on a public website, and different rights in a work (e.g., rights of publicity and copyright) might have different owners and restrictions. Moreover, while the face swapping by apps like Reface appears to go only one way (i.e., replacing the face of an individual in the Pre-set catalogue with a face from an image uploaded by the user), apps that allow face swapping in the other direction (i.e., putting a celebrity’s face onto someone else’s body) could give rise to even more issues.