By Myriah Jaworski
Tue | Apr 16, 2024 | 4:01 PM PDT

Laws prohibiting the use of a person's likeness for commercial gain have been in effect for some time, testing everything from the value of an influencer's endorsement to "freemium" reports by people search companies.

Right of Publicity (ROP) claims, like the state statutes and common law upon which they rely, are varied in substance and outcome. But in general, they are brought to remedy some alleged misuse of a person's actual likeness—their image in a photograph or voice in a recording. But what happens if the alleged likeness is entirely AI-generated?

To the extent AI-generated deepfakes and voice clones did not already fit within existing right of publicity laws, states are amending those laws or enacting new ones specifically to address AI-derived content. This legislation follows advocacy by musicians and entertainers, including an open letter issued by the Artists Rights Alliance on behalf of 200+ songwriters and musicians, calling the commercial use of AI vocals and imagery, including in training data for AI models, "predatory" and a practice that "must be stopped."

For example, in March 2024, Tennessee enacted the Ensuring Likeness, Voice, and Image Security (ELVIS) Act of 2024 (effective July 1, 2024), becoming one of the first states to enact legislation that directly regulates the misappropriation of an individual's identity through use of generative AI (e.g., using AI to create a deepfake or clone an artist's voice). This right, as one may imagine from a law with the acronym ELVIS, extends to provide postmortem rights to the estates of deceased individuals.

And more is on the way. Currently, there are at least seven proposed state and federal laws that seek to regulate the use of AI to create deepfakes or otherwise impersonate unique attributes of an individual's identity (i.e., their image, voice, or likeness). Federal regulators are getting involved too—the FTC and US Copyright Office both have active initiatives around AI replication of appearances, voices, or other unique aspects of an individual's identity.

Importantly, artificial intelligence services, internet platforms, and other technology companies should remain vigilant of this active and proposed regulation, as they could be subject to liability based on content available through their services or platforms.

This article discusses Tennessee's new ELVIS law and other proposed legislation and regulations relating to AI replication of appearances, voices, or other unique aspects of an individual's identity.

Tennessee's ELVIS Act

Liability

Prior to the amendment, Tennessee's existing right of publicity law, the Personal Rights Protection Act of 1984, Tenn. Code § 47-25-1101 et seq. (TPRPA), prohibited the "unauthorized use" of an individual's "name, photograph, or likeness" for a "commercial purpose." The ELVIS Act amended the TPRPA to similarly prohibit the unauthorized commercial use of an individual's voice.

"Voice" is broadly defined as "a sound in a medium that is readily identifiable and attributable to a particular individual, regardless of whether the sound contains the actual voice or a simulation of the voice of the individual." Generative AI is already powerful enough to use artists' audio samples to create new music that duplicates the artists' voice and sound. This broad definition is aimed at regulating this use of audio samples to create AI-generated deepfakes or audio cloning.

The ELVIS Act creates potential liability not only for those who create AI deepfakes or audio clones, but also for those who publish or distribute unauthorized AI-generated content, and those who create algorithms, software, or tools that allow for the creation of such content. Additionally, while the former TPRPA prohibited only unauthorized use that was both "knowing" and "commercial," these new liability provisions are not expressly subject to such restrictions.

Thus, these provisions have massive implications for artificial intelligence services, internet platforms, and other technology companies. The ELVIS Act's lack of knowledge and commercial use requirements could be read as indirectly imposing content moderation requirements against the sharing of content among users. The ELVIS Act could also impose liability against creators of AI tools, even where the tool's creator itself does not create the infringing content (or derive a profit from it).

Exemptions

The ELVIS Act narrows the exemptions under the TPRPA in two ways. First, the ELVIS Act narrows the fair use exemption. The TPRPA formerly identified "fair use" as any use "in connection with any news, public affairs, or sports broadcast or account." Now, fair use is coextensive with any "use [that] is protected by the First Amendment." Facially, this is not a significant change (as the former "fair use" definition was intended to embody First Amendment protections). However, the shift in language potentially raises the burden on the party invoking the defense, because that party must now prove a protected use under the First Amendment rather than relying on the statutory language.

Second, the ELVIS Act narrows the TPRPA's advertising exemption under Tenn. Code § 47-25-1107(c). The former version of the TPRPA did not hold an advertising publisher (e.g., newspapers, magazines, and radio stations) liable for publishing an advertisement containing an unauthorized use unless it knew of the unauthorized use. Under the ELVIS Act, the advertising publisher is now subject to liability if it knew or "reasonably should have known" of the unauthorized use.

Remedies and enforcement

The ELVIS Act's remedies are the same as those existing under the TPRPA. The Act allows injunctive relief to prevent further unauthorized use and/or destruction of materials, and actual damages (including profits derived from the unauthorized use), plus attorneys' fees and costs.

However, the ELVIS Act makes one important change as to who has standing to bring claims under the TPRPA. The ELVIS Act grants standing to a licensee (e.g., a record company) when it has either "a contract for an individual's exclusive personal services as a recording artist" or "an exclusive license to distribute [the artist's] sound recordings."

Other proposed legislation and regulation

Artificial intelligence services, internet platforms, and other technology companies should remain aware of other proposed laws and regulations in this space:

Federal activity

State activity (proposed law or amendments)

  • California (AB 1836): This proposed law would create civil liability for the use of "digital replicas" (i.e., AI-generated deepfakes or voice replication) of deceased celebrities. The bill defines "digital replica" as a "simulation of the voice or likeness of an individual that is readily identifiable as the individual and is created using digital technology."
  • Illinois (SB 3325): The proposed legislation would amend Illinois' Right of Publicity Act to expand the definition of "identity" to include a "simulation of the attribute of an individual, or is created through the use of artificial intelligence." The amendment would also give standing to third parties who have exclusive distribution rights.
  • Kentucky (SB 317): The proposed legislation, much like the ELVIS Act, regulates the unauthorized commercial use of an individual's identity through AI-generated content.
  • Louisiana (SB 217): This proposed legislation amends the Louisiana Election Code to require disclosure of AI-generated deepfakes or digital replication of political candidates.

Regulatory activity

Increase in ROP litigation to address AI

To date, courts have repeatedly beaten back efforts by artists, comedians, and authors to address the use of their content as training data for AI models through copyright or other IP protections. But where the stringent standards of copyright law may be an imperfect fit, the flexibility of right of publicity, or ROP, claims may provide a better avenue to address the same or similar harms.

For example, in Andersen v. Stability AI Ltd., the District Court for the Northern District of California dismissed the right of publicity claims of the plaintiff, a visual artist who alleged her "artist identity" was misappropriated by its use in Stability's training models. But the court did so with leave to amend, specifically directing that the plaintiffs "clarify their right of publicity theories as well as alleged plausible facts in support regarding each defendants' use of each plaintiffs name in connection with advertising specifically and any other commercial interests of the defendants."

And, in In re Clearview AI, Inc. Consumer Privacy Litigation, the District Court for the Northern District of Illinois held that the plaintiffs, a class of consumers who alleged that their online information was scraped by Clearview AI for use in its AI model, had adequately stated ROP claims under both California and New York statutory and common law. The plaintiffs had alleged the scraping and use of their photographs and photographic images online, and that the use in the Clearview AI database was sufficiently commercial, such that the plaintiffs did not need to plead separate and specific advertising of their names or likenesses.

With the ELVIS Act and other state and federal proposals, we expect to see a continued increase in ROP claims to address AI-generated content.

This article was co-written by Myriah V. Jaworski, Chirag H. Patel, and Nicolas V. Dolce and appeared originally here.
