- First Gen Law attorneys discuss bill targeting deepfakes
- Federal legislation would protect and support individuals
Tennessee’s Ensuring Likeness Voice and Image Security Act, or ELVIS Act, is taking effect July 1. It replaces Tennessee’s narrow Personal Rights Protection Act of 1984, creating a property right in the use of a person’s “name, photograph, voice, or likeness.”
The law confirms the legal community’s fears of a state race to the bottom to create the most expansive law possible absent federal intervention.
The federal government is uniquely positioned to address the right of publicity and/or deepfakes. Unlike state legislatures, Congress’s obligation is to the entire country—not one region or industry. Constitutional mandate aside, it also has a precious opportunity to protect people, society, and government from an urgent, intersectional, existential threat. But to paraphrase Elvis himself, conversation is no longer enough. Now is the time for serious action.
Practitioners and industry associations such as the International Trademark Association and the American Bar Association have long advocated for a federal right of publicity to combat the “jagged and unpredictable reach” of the 35 different state-based rights. Such inconsistencies create overall uncertainty and business inefficiencies, make attorneys unable to effectively counsel their clients, and conflict with fundamental freedoms.
New York’s right of publicity law was the first, in 1903. It protects a person’s “name, voice, signature, photograph, or likeness” from unauthorized commercial use within the state. New York’s was also the first to include “digital replicas,” in 2021—but only for deceased performers and personalities domiciled in New York when they died.
Louisiana’s Allen Toussaint Legacy Act, enacted in June 2022, provides additional context and contrast with the ELVIS Act. New York, Louisiana, and Tennessee all recognize a person’s identity as descendible, assignable, and licensable property.
All three states try to prevent deepfakes and prohibit unauthorized third-party commercial use, with exceptions for fair uses such as news, sports, and public affairs.
But where others stop at the state line (Louisiana’s law only protects Louisianans from violations within Louisiana), Tennessee extends its right of publicity to anyone “living or dead,” no matter where the individual or the violation is located. This raises extraterritoriality concerns, implicating other states’ and the federal government’s jurisdiction and sovereignty.
Even more shocking is Tennessee’s approach to deepfakes—digital replicas created by AI or machine learning that have presented the legal community with questions about the nature of creativity, originality, and authorship.
Combining the right of publicity and deepfake legislation pits the entertainment industry (which runs on performers, creators, and personalities) against the technology sector (which is fueled by corporate innovation and iteration).
Most states have averted this by keeping the two issues separate. Louisiana’s right of publicity statute carefully exempts data collection, processing, and licensing. Like other states, it shields media companies, platforms, and service providers “which carried or transmitted the content” from liability.
But the Tennessee bill goes beyond the existing law, which prohibited misuse “in any medium, in any manner.” It does this by creating a new cause of action against anyone who “distributes, transmits, or otherwise makes available an algorithm, software, tool, or other technology, service, or device” whose “primary purpose or function” is the unauthorized “production” of a protected identity.
As with the right of publicity, states have stepped in to address deepfakes in the absence of meaningful federal action. After the Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, the FTC’s Impersonation Rule became effective, but Congress is still discussing two bills on the topic, both of which would establish a federal right of publicity.
The Senate’s Nurture Originals, Foster Art, and Keep Entertainment Safe Act would specifically prevent unauthorized digital replicas of individuals’ names, faces, and voices. The No Artificial Intelligence Fake Replicas And Unauthorized Duplications Act in the House targets “personalized cloning service[s]” and unauthorized “digital voice replica or digital depiction[s]” in interstate and foreign commerce.
On the same day the ELVIS Act was enacted, Wisconsin enacted a law requiring political ads to disclose any synthetic media or AI-generated content. Digital replica legislation has also been introduced in Virginia, Texas, and California.
In 1984, when the Supreme Court declined to hold Sony liable for manufacturing “copying devices” (VCRs) in Sony v. Universal City Studios, content was mostly entertainment. Forty years later, many tools to create and manipulate content are free and accessible, and half of Americans get their news from platforms offering all types of content along with the technologies to create and manipulate it.
Even though 50% of Americans don’t know what a deepfake is, 55% are so worried about false information online that they want the US government to step in—even if it restricts freedom of information.
If intellectual property can be as valuable or destructive as real property, should deepfakes be outlawed like counterfeit currency or haphazardly regulated like 3D-printed guns? The legal industry, state governments, and the people themselves are, for once, agreed that they should be. And most of us think this reckoning needs to occur at the federal level.
This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.
Author Information
Anuj Gupta is managing partner of First Gen Law, with focuses on matters in technology and entertainment.
Rebecca Neipris is senior attorney at First Gen Law, focusing on entertainment and intellectual property.