
    The Lawletter Blog

    CIVIL PROCEDURE: Artificial Intelligence and Court Opinions

    Posted by Brett R. Turner on Fri, May 1, 2026 @ 09:05 AM

    The Lawletter Vol. 51 No. 2

    Brett Turner—Co-President

          Much attention has been paid in recent months to the misuse of artificial intelligence (“AI”) software by attorneys. But what about the use of AI software by judges?

          In Payne v. State, No. S26A0459, the Georgia Supreme Court discovered that a trial court order dated September 12, 2025, contained serious citation errors. Three cases cited in the order did not exist at all; three cases were properly cited, but language expressly quoted from them did not appear in the opinions; and three cases were properly cited, but their holdings were clearly misstated. These sorts of errors are common when AI is used to prepare a document explaining legal reasoning without sufficient human supervision.

         The order stated that it was prepared by counsel for the State. Counsel was questioned closely at oral argument and directed to submit an affidavit providing all information known to her as to how the trial court’s order came to include these citation errors. At least some of the errors also appeared in a series of reply briefs filed by the State in the trial court. Counsel for the State was also ordered to provide copies of the State’s proposed order and of all communications with the court regarding the order.

         This overview is based upon a report by Anna Bowers found here. The report contains a copy of the Georgia Supreme Court’s order and a video of oral argument. There are also local media reports indicating that the state’s attorney apologized to the court for the citation errors, which resulted from misuse of AI. See, e.g., https://www.fox5atlanta.com/news/clayton-prosecutor-punished-using-ai-court-filings-citing-fake-cases; https://www.atlantanewsfirst.com/2026/03/30/attorney-with-clayton-county-das-office-apologizes-using-ai-citing-fake-cases-court-brief/.

          The outcome of the Payne case remains to be seen. But a series of lessons already presents itself. First, counsel should double-check all legal citations for this sort of error, regardless of where the citations are found. The fact that a citation appears in a court order does not necessarily mean that no citation error is present. The order may be based upon citations submitted by counsel, as appears to have been the case in Payne; and it is far from impossible that the court itself may be responsible. Judges, no less than lawyers, need to be careful to verify all citations suggested by AI.

          Second, citations need to be checked for error at the time they are submitted, not weeks or months afterward. It is quite striking that the serious citation issues in Payne were not discovered during post-judgment proceedings in the trial court, and were not discovered in the Georgia Court of Appeals. Only the Georgia Supreme Court discovered the errors. It is scary to think that errors of similar magnitude may well have gone undiscovered throughout the litigation process.

         In other words, when litigating a case, and especially when appealing an unfavorable order, citation-checking should be among the first steps taken after an order is issued. Even if there is no time to check the substantive accuracy of citations, it should at least be possible to determine whether the case cited actually exists at the volume and page of the reporter indicated. Indeed, many major citation services provide a product which pulls the case citations out of a legal document and performs basic validation, such as checking whether the case actually exists and whether it has been expressly overruled. These services are not 100% reliable, because a case can be overruled by implication, or as one of a class of decisions, without ever being expressly flagged. But they can at least be trusted to determine whether a cited case actually exists.

         Third, it can actually be quite difficult to avoid the normal consequences of citing false authority to a court. Even if the trial judge and opposing counsel tend not to double-check citations, citations to false authority may be discovered on appeal. For a false citation to escape notice, it must often survive multiple levels of review.

         This fact has an important effect upon the cost-benefit analysis involved when AI is used to obtain citations. The chance of submitting n briefs containing false authority without getting caught is the chance of not getting caught on one brief, raised to the nth power. If the chance of getting caught is even five percent—that is, if the chance of avoiding detection on any one brief containing false authority is 95%—then the chance of avoiding detection on 100 independent briefs is 95% raised to the 100th power. That number is very small: just under 0.6%. And most trial attorneys file 100 briefs within their first few years of practice. Thus, over the long term, if an attorney regularly uses authority produced by AI without checking the citations, the chance of eventually getting caught is very close to 100%.
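         The compounding arithmetic above can be verified with a short calculation. (The 95% single-brief escape probability and the 100-brief count are the illustrative figures used in this discussion, not empirical data.)

```python
# Illustrative figures: a 95% chance of escaping detection on any
# single brief containing false authority, over 100 briefs.
p_escape_one = 0.95
n_briefs = 100

# If the briefs are treated as independent, the escape probabilities multiply.
p_escape_all = p_escape_one ** n_briefs

print(f"Chance of never being caught:  {p_escape_all:.2%}")      # about 0.59%
print(f"Chance of eventual detection: {1 - p_escape_all:.2%}")   # about 99.41%
```

         Even a far smaller per-brief detection rate compounds the same way: at a 1% chance of getting caught per brief, the chance of escaping detection across 100 briefs is 0.99 raised to the 100th power, still only about 37%.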

         That fact should have a powerful effect upon the citation-checking process. It is increasingly well understood that AI, left unsupervised, tends to hallucinate. The legal world is slowly gaining experience in discovering hallucinations, and even a small chance of discovery quickly multiplies as the number of times AI is used increases. Against this background, to use AI regularly without checking citations is essentially to guarantee discovery. The only effective strategy for long-term survival is either to carefully double-check all citations produced by AI or to refrain from using AI at all.
