- Advocates push for regulation ahead of 2024 elections
- Debates over election agency’s jurisdiction threaten feasibility
As the use of “deepfakes” skyrockets, federal regulators are wrestling with how to rein in the new technology using election laws written for an era before artificial intelligence.
The hyper-realistic, AI-generated audio and visual content enables users to manufacture other people’s statements and actions, creating events that never took place. To some, the new creative power will perpetuate misinformation—especially when used in politics.
“It’s going to be very difficult to try to discern reality,” said Craig Holman, a government affairs lobbyist for the non-profit Public Citizen.
While nine states have enacted laws regulating deepfakes, the feasibility of federal legislation is still up in the air. Public Citizen submitted a new petition to the Federal Election Commission last week urging the agency to reconsider regulating after a 3-3 deadlock in June halted the group’s initial effort. If the commission again shrugs off the request, more pressure will pile on Congress to act.
Debates about the agency’s power to police the technology come as political groups take advantage of deepfakes ahead of the 2024 presidential elections. Florida Gov. Ron DeSantis’s presidential campaign, for example, circulated a video in June that included apparently AI-generated images of Donald Trump embracing Anthony Fauci.
“That one stands out because that was done by a mainstream political candidate,” Andrew Grotto, an international security fellow at Stanford University’s Cyber Policy Center, said about the ad.
Grotto has watched deepfakes, which he likens to “Photoshop on steroids,” go from an expensive technology to a democratized tool. What was once niche is now widespread—leaving officials to grapple with what to do about it.
Muddled Jurisdiction
When the FEC commissioners deadlocked along party lines in June, they not only shot down Public Citizen’s petition, but also blocked the agency from hearing public comment.
Declining to issue a “Notice of Availability” was highly irregular, according to Public Citizen’s July 13 petition. Republican commissioner Allen Dickerson agreed.
“I recognize that’s a very unusual decision, and it’s a first for me personally,” Dickerson said during the June meeting.
But Dickerson stressed the FEC’s narrow jurisdiction under campaign finance laws, nodding at Congress to pick up the reins.
The commission’s Democratic chairwoman, Dara Lindenbaum, also expressed skepticism about the scope of the FEC’s authority, despite voting in favor of hearing public comment.
To Holman, of Public Citizen, the deadlock was unexpected.
“It was breathtaking,” said Holman, who wrote both petitions to the FEC alongside Public Citizen President Robert Weissman.
Holman and Weissman brought the request before the agency precisely because they thought it had proper, though narrow, jurisdiction.
Specifically, they pointed to existing statutory provisions that prohibit “fraudulent misrepresentation,” when federal candidates, their campaigns, or their agents falsely represent themselves as “speaking, writing, or otherwise acting for or on behalf of any other candidate or political party or employee or agent.”
“That’s exactly what artificial intelligence has been used to do in some of these ads,” Holman told Bloomberg Law. “It’s the exact same phenomenon. It’s just much more highly developed and much more dangerous at this point.”
The FEC declined to comment.
Fraudulent Misrepresentation
The provisions against “fraudulent misrepresentation” come from the 1971 Federal Election Campaign Act, the primary US federal campaign law that the FEC was created to enforce. The statute focuses on regulating the money that flows in and out of US Senate, House, and presidential campaigns.
Public Citizen has asked commissioners to specify in guidance and amended rules that penalties would apply if candidates or their agents fraudulently misrepresent other candidates or political parties through false AI-generated content.
This is a narrow route to regulation. The section of the law regulating “fraudulent misrepresentation” applies only to federal candidates, not super PACs, according to Kate Belinski, a partner at Ballard Spahr who focuses on political and election law.
Super PACs, unlike candidates’ campaigns, can raise unlimited sums of money from organizations and individuals alike. Then, they can spend without restriction to indirectly promote candidates and advocate against others—often through political ads.
FEC regulation of deepfakes, under this provision, could exclude super PAC usage, Belinski said.
She added that candidate campaigns are generally less inclined to use deepfakes because doing so reflects badly on them.
“They often will try to stay more pure and leave the dirty work to the super PACs,” she said. “It’s more likely that a super PAC or non-connected PAC would want to do some devious deepfake anyway.”
Some observers are wary of leaving the impression that the FEC is the catch-all agency for deepfake regulation.
The commission tracks campaign finance data, sets contribution limits, and supervises public funding in presidential elections.
“The FEC doesn’t get into the content of political ads, other than requiring disclosure,” Grotto, of Stanford, said. “It’s not about whether or not the content of the advertisement is true or false.”
Grotto said he doesn’t see the agency pursuing regulations of deepfakes under “fraudulent misrepresentation” because that area of jurisdiction is “relatively untested” given the thorny questions it raises.
Uncharted Territory
Determining what is true and false gets into the gray areas of campaign regulations.
Though the Federal Trade Commission regulates truth in commercial ads, neither the FEC nor the Federal Communications Commission does the same for political advertising. According to the FCC, it generally doesn’t “ensure the accuracy of statements that are made by candidates.”
“We really don’t have laws in this country about telling lies, or truth-in-advertising laws,” said Travis Ridout, a co-director of the Wesleyan Media Project.
When the project started tracking federal election advertising more than 10 years ago, Ridout estimated there were 3,000 or 4,000 ads in circulation. Now, he said there are millions, some targeted to niche audiences and many on streaming services or other digital platforms.
“It’s a very difficult environment to police,” Ridout said.
For First Amendment groups, concrete government intervention could harm the public’s free speech rights.
“If a candidate says ‘this person has done a terrible thing’ often those are statements of just belief,” said Bradley Smith, chairman of the Institute for Free Speech. “It may be wrong, there may be something that most of us would objectively look at and say is wrong, but if a person believes it is right, we should be leery about trying to have the government step in and be an arbiter of the truth.”
The Institute for Free Speech has argued that activist efforts to expand the FEC’s political speech regulations would be “harmful and unwise.”
Still, Smith, a former FEC Republican commissioner from 2000 to 2005, distinguished between deepfakes and other issues threatening free speech.
“It’s sort of a fraud,” Smith said about deepfakes. “It’s problematic in a way that goes outside of the normal free speech concerns that campaigns should be able to say whatever they want.”
Smith said there’s a case to be made that politicians would have to disclose if their ads include deepfakes, but he’s not sure the FEC has any authority beyond the minimal disclosures it already requires of candidates under federal election law.
The Role of Congress
The day Public Citizen filed its second petition to the FEC, 50 Democratic lawmakers also wrote to the agency calling for action.
Led by Rep. Adam Schiff (D-Calif.), the lawmakers urged the commission to take up the petition.
“As the 2024 Presidential election quickly approaches, it is imperative that the FEC allow comment on Public Citizen’s petition for rulemaking,” the Democrats wrote.
As a regulatory agency, the FEC remains reliant on “green-lights” from Congress to determine the bounds of its jurisdiction.
“I think the onus really is probably on Congress to at least say, ‘Someone please regulate this,’ or to pass some laws that regulate this,” Ridout at the Wesleyan Media Project said.
There has also been appetite in Congress to pursue more comprehensive legislation to curb deepfakes, or at least increase transparency, in and out of politics.
Sens. Amy Klobuchar (D-Minn.), Cory Booker (D-N.J.), and Michael Bennet (D-Colo.), for example, have introduced a bill that would require disclaimers on political ads containing AI-generated images or video.
Still, experts said partisan gridlock could stop legislation in its tracks. As many lawmakers mount reelection campaigns, there also could be benefits to delaying concrete regulation.
“Maybe after some politicians are burned by this, they might have more appetite,” Ridout said. “But they may also feel they can use it against their opponents.”
Politicians, he said, “can oftentimes act in their own self interests or their own reelection goals.”