Artificially generated and altered images or recordings, commonly referred to as “deepfakes,” are becoming increasingly common in mainstream media and complicating the authentication of evidence.
A few weeks ago, artificial intelligence generated a song mimicking the voices of music artists Drake and The Weeknd, though neither actually sang it. And in March, Pope Francis appeared to be wearing a white puffer coat from French luxury fashion house Balenciaga SA, but the image was also artificially generated.
As “deepfakes” continue to leave an imprint on popular culture, it’s inevitable that they will find their way into the courtroom. AI will become further implicated in claims of identity theft, securities and healthcare fraud, and defamation, to name a few.
Consequently, there’s a growing need to understand how to authenticate what is real, so the image or recording can be entered into evidence, and how to toss out what is fake. Or, when the fake image or recording is itself the basis for the claim, how to enter that into evidence.
Musk Be Real
The issue of authentication and deepfakes arose at the end of April in a wrongful death case against Tesla Inc., when counsel for Tesla refused to admit, for authentication purposes, that videos of Elon Musk touting the electric car’s Autopilot feature were genuine and accurate. This didn’t sit well with Santa Clara County Superior Court Judge Evette Pennypacker, who found Tesla’s contentions “deeply troubling.”
In March 2023, the court granted Tesla’s motion for a protective order to ward off the deposition of Musk, pending Tesla’s responses to written discovery, including requests for admission regarding the authenticity of certain statements Musk made during various interviews or other public engagements. As long as Tesla provided complete responses to the written discovery, there would be no need for Musk’s deposition, the court said.
Except Tesla didn’t provide adequate responses, which the court viewed as Tesla “trying to avoid at all costs tying itself to Mr. Musk’s statements or denying outright that Mr. Musk made the statements.”
Consequently, in its tentative ruling, the court allowed a limited deposition of Musk to confirm that he was at specific interviews and made specific, already-identified statements during those interviews.
Musk is likely not pleased with this development.
Apex depositions of high-level corporate officers, such as a president or chief operating officer, are rare and often used as a last resort. The higher the witness sits in an organization’s hierarchy, the stronger the presumption that the witness lacks knowledge directly relevant to the evidence sought in deposition, and the more appropriate the apex doctrine’s protections become. To seek the deposition of a high-level corporate officer, the proponent of the deposition must demonstrate a special need, including identifying issues of which the apex witness has first-hand knowledge.
In the case of Musk, the court found that “[i]ronically, Tesla’s refusal to answer these questions only makes a clearer record that Musk is the only person that has this information to respond to this discovery, one of the pre-requisites to permitting an Apex deposition.”
Authentication is best handled first through less expensive and intrusive means.
Cut to the Chase
Whether the case is in state court or federal court, authentication of evidence will almost certainly be part of the pre-trial preparations.
The Federal Rules of Evidence require evidence to be authenticated before it’s admitted. Proving authentication requires a showing that the evidence is what the proponent claims it is. Authentication can be accomplished in a number of ways, with the intent of laying a foundation of genuineness sufficient to support a finding of authenticity by the trier of fact.
Traditionally, requests for admission are an inexpensive and efficient way to authenticate evidence before trial. The federal rules don’t cap the number of requests for admission, which can be used to establish the authenticity of multiple items at once without the need for additional discovery or testimony.
If there’s a shared interest in the evidence, parties can also stipulate to the authenticity of the items, without any back and forth through written discovery. For summary judgment purposes, parties may also obtain a declaration from an individual knowledgeable about an item’s authenticity.
These approaches are routinely used to authenticate uncontested evidence. It’s best to establish authentication well in advance of the discovery deadline to avoid last-minute roadblocks that could preclude the use of the evidence at trial.
Don’t Overcomplicate It
As with the introduction of instant messaging and email correspondence into evidence, there’s no need to craft unique rules of admissibility for artificially generated images or recordings. The admissibility of the evidence will be evaluated on a case-by-case basis, as with any other document, to make sure there’s a sufficient showing of its relevance and authenticity.
A more detailed approach may need to be taken, though, if the proponent of the evidence is trying to prove or disprove the existence and authenticity of what is claimed to be a “deepfake.”
If the proponent of the evidence believes the image or recording is real, its authenticity can be established through direct or circumstantial evidence. Conversely, if the “deepfake” is what led to the lawsuit, as in a case where the plaintiff claims harm from a deepfaked video, the plaintiff has to show that the “deepfake” is not what it purports to be. Lay witness and expert testimony are strong sources for establishing what the image or recording should show, and how the fake differs from it.
If the opponent of the evidence wants to establish that the image or recording is a “deepfake” that should not be admitted into evidence, locating the original image or recording may be a necessary first step. If an original doesn’t exist (or never did), a digital forensic expert can help demonstrate the artificial generation, tampering, or manipulation of the image or recording. Deposition testimony from the individual depicted in the image or recording, or from a witness with knowledge, may also be necessary to explain the falsities.
The threshold for authentication is low and can be met in multiple ways. Conclusive proof isn’t required; the proponent need only offer evidence sufficient to support a finding that the item is what the proponent claims it to be. Once the evidence is admitted, its credibility and the weight it’s given are issues for the trier of fact.
As for Musk, he may have to sit for a three-hour deposition, where he’ll face a litany of questions about the authenticity of the purportedly “deepfake” videos, including whether he’s the person depicted in them and whether he was at the specific interviews or public engagements they show. The plaintiffs won’t be permitted to ask any questions about the substance of the statements, though.
Whether the plaintiffs will be successful at debunking claims that the videos are deepfakes by deposing Musk remains to be seen. They may ultimately have to turn to circumstantial evidence to substantiate the authenticity of the videos. Fortunately for them, the hurdle is low.