Following on from our recent article on the citation of fake legal authorities in the High Court in The King on the application of Frederick Ayinde v The London Borough of Haringey, the Divisional Court, exercising its Hamid jurisdiction, has issued a further judgment covering both that case and Hamad Al-Haroun v Qatar National Bank QPSC & Anor, addressing serious concerns arising from the submission of false, AI-generated information in court proceedings.
Background to the Cases and the Hamid Jurisdiction
The Hamid jurisdiction relates to the court’s inherent power to regulate its own procedures and enforce duties owed by lawyers to the court. These two cases were referred to the Divisional Court following the actual or suspected use of generative AI tools to produce written legal arguments and witness statements that were not checked using authoritative sources, resulting in fake citations and quotations being put before the court.
In the Ayinde case, Ms Forey, a pupil barrister, submitted grounds for judicial review that included five non-existent case citations and misstated the effect of a statutory provision. Despite being questioned by the defendant's solicitor, Ms Forey failed to provide copies of the cited cases and initially dismissed the errors as "cosmetic". Ritchie J, who heard the initial wasted costs application, found the behaviour of Ms Forey, and that of Haringey Law Centre (by whom she was instructed), to be improper, unreasonable and negligent, concluding that Ms Forey had intentionally put the fake cases into her statement of facts and grounds.
In the Al-Haroun case, Mr Al-Haroun had brought a high-value claim for breach of a financing agreement. Both Mr Al-Haroun and his solicitor, Mr Hussain, produced witness statements in support of an application to set aside an order extending time for the defendants to file evidence. Referring the case to the Divisional Court, Dias J noted that the witness statements referred to forty-five authorities in total, eighteen of which did not exist, with many of the remainder either not containing the quotations attributed to them or not supporting the propositions for which they were cited. Mr Al-Haroun admitted that he had generated the citations using publicly available AI tools and other online sources. His solicitor, Mr Hussain, admitted that he had simply relied on his client's research without verifying the cited authorities.
The Court's Findings on AI and Professional Duties
The Court acknowledged that AI is a powerful tool likely to have a continuing and important role in litigation, noting in particular its use in large disclosure exercises in the Business and Property Courts.
However, the Court emphatically stated that freely available generative AI tools trained on large language models, such as ChatGPT, are not capable of conducting reliable legal research and often produce "apparently coherent and plausible responses" that are "entirely incorrect," cite non-existent sources or purport to quote passages that do not appear in the source.
The Court reaffirmed the professional duty of both solicitors and barristers to check the accuracy of any research conducted using AI by reference to authoritative sources. This duty applies whether lawyers conduct the research themselves or rely on the work of others.
The judgment reinforced the existing regulatory duties of barristers and solicitors that are likely to be breached by the citation of fake authorities:
- Under the Bar Standards Board Handbook, barristers must observe their duty to the court, act with honesty and integrity, avoid diminishing public trust, and provide a competent standard of work. They are also prohibited from knowingly or recklessly misleading the court and must not draft documents containing contentions that are not properly arguable.
- Solicitors are similarly bound under the SRA Code of Conduct not to mislead the court, only make properly arguable assertions, avoid wasting court time, and provide a competent service. A solicitor also remains accountable for work conducted on their behalf by others.
Sanctions and Outcomes
The judgment details the wide range of powers available to the court in response to the misuse of AI to generate false or misleading citations. These range from public admonition in a judgment (rarely sufficient on its own), through wasted costs orders and referral to a regulator for breach of the duties outlined above, to proceedings for contempt of court and referral to the police in the most egregious cases involving a deliberate intention to deceive the court.
In Ayinde, although the Divisional Court found Ms Forey to have acted negligently and unreasonably, it decided not to initiate contempt proceedings against her, noting her junior status, the public criticism she had already received, and her self-referral to the Bar Standards Board. The court explicitly rejected Ms Forey's argument that applying the correct legal principles supported by fake cases was not improper conduct (she argued this was akin to mislabelling a tin where the tin's contents were correct), stating that this "entirely misses the point and shows a worrying lack of insight".
The Court referred Ms Forey to the Bar Standards Board for, amongst other things, consideration of the truthfulness of her explanation of what had happened. Mr Amadigwe, the supervising solicitor, was also referred to the Solicitors Regulation Authority for the inadequate steps he took in response to the false citations and for his inadequate oversight of Ms Forey.
In Al-Haroun, the court accepted Mr Al-Haroun's apology and lack of intent to mislead, but stressed that his errors did not absolve his solicitor of responsibility. The court found Mr Hussain's reliance on unverified research to be a "lamentable failure" but did not initiate contempt proceedings, as there was no evidence of an intention to mislead. Instead, the court referred Mr Hussain to the SRA.
Analysis
This judgment is a reminder that, while generative AI is a powerful and useful tool in litigation, it must be used with proper human oversight, and that verification of AI-generated material against authoritative sources is essential.
The judgment includes an annex detailing fourteen reported instances of lawyers citing fake AI-generated authorities in the UK, USA, Canada, Australia, and New Zealand, highlighting that the issue has become increasingly widespread.
Finally, the judgment underlines the severity of the penalties that the court can impose for the misuse of AI-generated citations and how seriously the court is likely to treat any further instances of this kind of misuse. The Court ordered that a copy of the judgment be sent to the Bar Council, the Law Society and the Council of the Inns of Court for them to consider further steps to ensure that the existing guidance on the proper use of generative AI is followed.