A cautionary tale of using AI in law: UK tribunal finds that AI generated fake case law citations

The First-tier Tribunal (Tax Chamber) ('FTT') has found as a fact that an appellant cited case law that did not exist and which instead appeared to 'have been generated by an AI system such as ChatGPT'.
The judgment summarises the harms as follows:
"Many harms flow from the submission of fake [judgments]. The opposing party wastes time and money in exposing the deception. The Court's time is taken from other important endeavors. The client may be deprived of arguments based on authentic judicial precedents. There is potential harm to the reputation of judges and courts whose names are falsely invoked as authors of the bogus opinions and to the reputation of a party attributed with fictional conduct. It promotes cynicism about the legal profession and the…judicial system. And a future litigant may be tempted to defy a judicial ruling by disingenuously claiming doubt about its authenticity."
The risks of using AI in law are well known and under the spotlight. The Solicitors Regulation Authority ("SRA") recently said this about results obtained from AI systems:
"All computers can make mistakes. AI language models such as ChatGPT, however, can be more prone to this. That is because they work by anticipating the text that should follow the input they are given, but do not have a concept of 'reality'. The result is known as 'hallucination', where a system produces highly plausible but incorrect results."
The case emphasises the need for caution and diligence when relying on output from AI systems, especially in higher-risk contexts such as legal proceedings.
If you would like to discuss how current or future regulations impact what you do with AI, please contact Tom Whittaker, Brian Wong, David Varney, Lucy Pegler, Martin Cook or any other member of our Technology team.
The case is Harber v Commissioners for His Majesty’s Revenue and Customs [2023] UKFTT 1007 (TC) (available here on BAILII). A detailed write-up of the case is available here. As the Tribunal's decision records:
In a written document ("the Response") Mrs Harber provided the Tribunal with the names, dates and summaries of nine First-tier Tribunal ("FTT") decisions in which the appellant had been successful in showing that a reasonable excuse existed. However, none of those authorities were genuine; they had instead been generated by artificial intelligence ("AI"). ... We accepted that Mrs Harber had been unaware that the AI cases were not genuine and that she did not know how to check their validity by using the FTT website or other legal websites.
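
That last point suggests a practical first step. As a purely illustrative sketch (the helper below is hypothetical, not a tool used in the case), a script can at least confirm that a citation follows the standard UK neutral citation format before a human verifies the judgment itself on the FTT decisions database or BAILII:

```python
import re

# Standard UK neutral citation shape: "[year] COURT number (division)",
# e.g. "[2023] UKFTT 1007 (TC)". This is a format check only: a string can
# be perfectly well-formed and still refer to a judgment that does not exist.
NEUTRAL_CITATION = re.compile(r"\[(\d{4})\]\s+([A-Z]+)\s+(\d+)(?:\s+\(([A-Z]+)\))?")

def parse_citation(citation: str):
    """Return (year, court, number, division) if well-formed, else None."""
    match = NEUTRAL_CITATION.fullmatch(citation.strip())
    return match.groups() if match else None

print(parse_citation("[2023] UKFTT 1007 (TC)"))      # ('2023', 'UKFTT', '1007', 'TC')
print(parse_citation("Sanders v HMRC, unreported"))  # None: not a neutral citation
```

Even where a citation parses, authenticity can only be confirmed by locating the full judgment on an official source; the fabricated authorities in Harber were convincing precisely because they looked well-formed.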