AI and Deepfakes Complicate Evidence in Workplace Investigations

Feb. 27, 2024, 9:30 AM UTC

The explosion in public access to artificial intelligence tools may soon make employer investigations more challenging, if it hasn't already.

For example, an employee can now use deepfake audio technology to mimic another employee’s voice, creating a scripted, fabricated recording of that person directing discriminatory or harassing expletives at others in the workplace.

Fast-improving generative AI applications can be used in a similar way to create realistic images, and even video, of people engaged in almost any activity using a simple text prompt.

Employers regularly make difficult decisions during workplace investigations about individual credibility. Recently, employers have experienced the added complexity of employees who submit electronic evidence to support their allegations or alibis under circumstances that raise questions of authenticity and legitimacy.

For example, an employee who incurs excessive points under an attendance call-in policy may suddenly submit a screenshot of a call history log that shows timely calls were placed to the attendance line all along. That call history may conflict with attendance line records that don’t support such calls. The company may question whether the screenshot is genuine or fabricated.

The “liar’s dividend” refers to the advantage a person gains by disputing the legitimacy of any factual representation on the ground that anything can be fabricated with modern AI tools. In the workplace investigation setting, this may pressure an employer to fortify every factual finding to the point that it’s unassailable.

The chances are increasing that your next workplace investigation may need to address the authenticity of electronic evidence that could be created by AI.

Several jurisdictions hold that courts don’t sit as “super personnel committees” to second-guess an employer’s business decisions, and they decline to replace a defendant’s investigatory judgment with their own. These cases can provide a measure of confidence for an employer’s good-faith investigation and conclusions.

Still, litigation defenses can be substantially strengthened by a sound investigation. A prospective plaintiff may forgo raising claims altogether after seeing that a company undertook a thorough investigation.

Here are a few steps to consider the next time AI emerges as a potential factor in your workplace investigation.

Opportunity to Respond

A traditional investigation often gives the accused a chance to address allegations levied against them. The accused may be aware of alibi evidence that can refute the allegations.

This step can be even more important where an employee presents electronic evidence that didn’t originate on the company’s equipment. The accused may be able to provide information that calls into question the legitimacy of the complainant’s supporting evidence.

Alternative Verification

Investigations of the past often relied on witnesses to verify allegations and assess credibility. When an employee presents evidence that potentially is the product of external AI programs, alternative sources of information can help verify its legitimacy. For example, time clock records and company security camera footage may be reviewed to confirm the presence of individuals portrayed in an employee’s personal photos and videos.

Native Materials

Electronically stored information reviewed by a forensic expert may confirm whether the date and time an electronic file was created match the date and time of the events depicted in an image, audio recording, or video sequence, supporting its legitimacy. This strategy works best after questioning the employee who presents the information about how, when, and where it was created.
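As a purely illustrative sketch of the kind of comparison such a review involves, the Python snippet below reads a photo’s embedded capture timestamp and measures how far it diverges from the time the submitter claims the photo was taken. The file name, claimed time, and use of the Pillow library are assumptions for illustration; embedded metadata is itself easy to alter, so an actual investigation should rely on a qualified forensic expert rather than a script.

```python
#!/usr/bin/env python3
"""Illustrative sketch: compare a JPEG's embedded EXIF capture time with a
claimed capture time. Assumes Python 3.10+ and Pillow (pip install Pillow).
EXIF data can itself be edited, so this is one signal, not proof."""
from datetime import datetime

from PIL import Image

DATETIME_ORIGINAL = 36867  # standard EXIF tag ID for DateTimeOriginal
EXIF_SUB_IFD = 0x8769      # pointer tag to the Exif sub-IFD


def embedded_capture_time(path: str) -> datetime | None:
    """Return the EXIF DateTimeOriginal timestamp, if the file carries one."""
    exif = Image.open(path).getexif()
    value = exif.get_ifd(EXIF_SUB_IFD).get(DATETIME_ORIGINAL)
    if value is None:
        return None
    return datetime.strptime(value, "%Y:%m:%d %H:%M:%S")  # EXIF date format


if __name__ == "__main__":
    claimed = datetime(2024, 1, 15, 8, 55)                 # hypothetical claim
    actual = embedded_capture_time("submitted_photo.jpg")  # hypothetical file
    if actual is None:
        print("No embedded capture timestamp; this check is inconclusive.")
    else:
        minutes = abs((actual - claimed).total_seconds()) / 60
        print(f"Embedded: {actual}  Claimed: {claimed}  Gap: {minutes:.1f} min")
```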

Privacy concerns can be mitigated through search limitations that focus on particular files or periods of time, retrieving only relevant information and preventing disclosure of anything else to the company. Employers should also consider any limitations on the company’s ability to investigate materials on electronic devices, including the need to obtain the device owner’s consent.
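To illustrate what such a time-limited search might look like, the sketch below selects only files last modified within an agreed review window and ignores everything else. The directory name and window are hypothetical; an actual collection would be scoped and executed by a forensic professional under whatever consent and policy constraints apply.

```python
#!/usr/bin/env python3
"""Illustrative sketch: limit a review to files modified within an agreed
window. The directory and window below are hypothetical; a real collection
would be scoped and run by a forensic professional."""
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical review window agreed on with the device owner.
WINDOW_START = datetime(2024, 1, 15, 0, 0, tzinfo=timezone.utc)
WINDOW_END = datetime(2024, 1, 16, 0, 0, tzinfo=timezone.utc)


def files_in_window(root: str) -> list[Path]:
    """Return only files whose last-modified time falls inside the window."""
    selected = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        modified = datetime.fromtimestamp(path.stat().st_mtime, tz=timezone.utc)
        if WINDOW_START <= modified <= WINDOW_END:
            selected.append(path)
    return selected


if __name__ == "__main__":
    for path in files_in_window("collected_device_copy"):  # hypothetical path
        print(path)
```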

Live Verification

Some sources of evidence presented by employees to support their position may be subject to immediate confirmation on an electronic device. For example, an employee who alleges a screenshot shows calls were made to the attendance hotline should be able to pull up the scrollable call history itself. An individual who controls the source of evidence may not expect an investigator to ask to review the information live.

Here, again, limitations on the search of electronic devices may need to be confirmed before requesting such a review. However, an individual’s credibility may be substantially strengthened or weakened by their ability to provide live verification of their electronic evidence.

Current applications of AI present significant opportunities for employers and employees alike. Keeping these tools in mind and incorporating strategies to address them in your next workplace investigation can help strengthen conclusions, give employees confidence in the process, and reduce the chances of ensuing litigation.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.

Author Information

Jesse Dill is a shareholder in the Milwaukee office of Ogletree Deakins, with a focus on the FLSA, state wage and hour laws, and labor relations.

To contact the editors responsible for this story: Jada Chin at jchin@bloombergindustry.com; Rebecca Baker at rbaker@bloombergindustry.com
