A federal rule requiring lawyers to certify the accuracy of their filings is gaining new traction in bankruptcy courts, where judges are sanctioning attorneys for submitting documents with fake AI-generated citations.
Courts nationwide are increasingly cracking down on lawyers who cite nonexistent cases in filings, errors that often occur because attorneys misunderstood the technology or failed to verify its output. Although courts have been grappling with how to monitor AI use by legal professionals for years, few incidents had appeared in bankruptcy practice until recently.
Bankruptcy judges in Illinois and South Carolina sanctioned attorneys this summer for submitting filings with hallucinated citations, and an Alabama judge recently ordered a Georgia lawyer to explain why she shouldn’t face discipline over “misleading and fabricated citations.”
At least three US bankruptcy courts—the Southern District of New York, the Northern District of Texas, and the Western District of Oklahoma—have issued orders or rules on AI. Most other bankruptcy courts rely on Rule 9011 of the Federal Rules of Bankruptcy Procedure, which requires attorneys to certify that their filings are accurate.
Around 40 nonbankruptcy judges have issued AI-related standing or general orders, according to the Rails AI tracker, a patchwork that practitioners say could cause inconsistency across cases nationwide.
Thomas Nield, a former Semrad Law Firm bankruptcy attorney who was sanctioned in July by the Illinois bankruptcy court, said he had previously used AI only to edit emails and for general queries.
Nield, who was terminated in September, used ChatGPT to research an ancillary issue in a Chapter 13 case. The court found the resulting filing included nonexistent cases and quotes.
“I haven’t used AI to do any legal research since,” Nield said days before his termination. “If I were to use AI again for legal research, it would only be within the confines of a legal research program such as Westlaw. Even then, I would only use it for an overview of the legal issue at hand and to help find original sources, not to write any of the brief.”
Semrad, which previously had no AI policy, adopted official guidelines after the incident, conducted a firmwide audit, withdrew its fee application in the Chapter 13 case, mandated AI training, and offered to reimburse opposing counsel. The managing partner didn’t respond to a request for comment.
Judge Michael Slade imposed a $5,500 fine but cautioned: “The next lawyer who does the same thing is warned that he or she will likely see a more significant penalty.”
More Sanctions Follow
The South Carolina bankruptcy court found in August that Paul Held, a solo practitioner with 40 years of experience, cited fake cases in court papers.
He was ordered to complete continuing legal education, including AI ethics training.
Held didn’t respond to a request for comment. The court said he admitted the citations generated by Microsoft Copilot resulted “from haste and a naive understanding of the technology” and that he was “shocked” to learn AI can produce fake information.
Cassie Preston of Gordon Rees Scully Mansukhani was recently accused in an Alabama hospital bankruptcy of improperly using AI.
The firm said in a now-withdrawn filing that the claims were unsupported and the alleged fabrications were “minor citation or paraphrasing errors.”
A bankruptcy judge will consider sanctions against Preston next month. Preston and Gordon Rees didn’t respond to requests for comment.
“My rule of thumb is you start off with the least sanction necessary to correct the behavior,” said Paul Grimm, a retired Maryland federal district judge.
He said “normative” sanctions deter future violations while “compensatory” sanctions reimburse harmed parties, such as a lawyer who spent hours trying to verify nonexistent citations.
“AI sounds so authoritative, so of course, you’d think it must be right,” he said. “I hope that five years from now, we won’t be seeing these things happen as frequently as they have been.”
Rule 9011
Legal Decoder CEO Joseph Tiano said there aren’t many AI-related standing orders in bankruptcy courts partly because Rule 9011 already requires attorneys to confirm information they use.
Standing orders can have the unintended consequence of causing confusion when courts define AI differently.
“Some define it broadly, some narrowly,” Tiano said. “In some cases, you have to disclose what tools were used; in others, you don’t.”
With Rule 9011, “there’s abundant precedent and time-tested protocols,” he said. “You know exactly what the state of play is.”
Courts must also determine which AI tools attorneys can use while protecting confidentiality, and which uses must be avoided or disclosed, since “AI use” can cover tasks as simple as grammar checks, said Nancy Rapoport, a law and ethics professor at the William S. Boyd School of Law.
“That can’t be what the court means,” she said. “When courts say generative AI, they’re getting closer to the real issue.”
If courts decide to adopt guidance, the “fairest” and most uniform approach is a local rule for all courts within a district or circuit, Grimm said. Local rules allow for public notice, comment, and revision, unlike an individual judge’s order, which manages specific issues in that judge’s courtroom.
AI guidance from the Southern District of New York and the Northern District of Texas bankruptcy courts directs attorneys to verify accuracy, while the Western District of Oklahoma requires an attestation form for any court document that uses AI.
Other courts mainly address disclosure and accuracy, but a few regulate other uses, including research and evidence generation.
AI Enters the Room
The industry is still exploring AI use and tools, data security, privacy, and billing issues. Bankruptcy associations offer training, and firms are adopting protocols.
The least risky place to deploy AI is in tools, such as data analytics, that improve the business of delivering legal services, rather than directly in the practice of law, Tiano said.
“It’s only recently that continuing legal education has been providing courses relevant to AI usage,” Nield noted.
Rapoport said state regulators could amend continuing legal education requirements to include AI in ethics courses.
“AI will fundamentally change the way we practice law, including staffing needs across experience levels and practice areas,” she said.