AI Shortcuts Cut Short? – New AI Disclosure Rules on Court Submissions

08/16/2023

As the artificial intelligence (AI) revolution pushes ahead, some jurisdictions are putting the brakes on the use of AI in courtrooms. On June 23, 2023, the Court of King’s Bench of Manitoba became the first court in Canada to require that litigants disclose if and how they used AI to prepare materials filed with the court.1 A similar practice direction was issued by the Supreme Court of Yukon just three days later. The introduction of rules related to AI is an emerging trend across Canada and the US, reflecting rising concerns about AI in legal practice and beyond.2

AI programs such as OpenAI’s ChatGPT, Google’s Bard, and Microsoft’s Bing present novel tools for legal professionals with plenty of apparent benefits. Such AI programs allow users to harness language processing technologies that may assist in researching, writing, and reviewing documents. However, their use is far from foolproof. AI language models are particularly susceptible to “hallucinations,” which arise when programs generate factually incorrect or nonsensical information that superficially appears plausible. Without careful review by the lawyer using these tools, faulty legal theories or citations may slip through the cracks and ultimately result in submissions that are confusing, unhelpful, and even misleading to the court.

AI hallucinations proved problematic in May of 2023, when a US attorney mistakenly relied on six fictitious cases generated by ChatGPT in his submissions to a New York District Court.3 Since then, two US District Courts in Texas and Illinois, as well as the federal US Court of International Trade, have implemented rules regarding the use and disclosure of AI in legal briefings.4

The practice direction introduced by the Court of King’s Bench of Manitoba stated that:5

  • Although still novel, it is apparent that some litigants will use AI in court submissions.
  • It is currently difficult for courts to prescribe the appropriate use of AI in litigation, but there are nevertheless legitimate concerns about its use.
  • To address these concerns, the Court requires parties to disclose when and how court materials are prepared using AI.

As courts continue to contend with both the challenges and opportunities posed by AI, it is likely that similar practice directions will be issued by courts of all levels across jurisdictions in the near future. Awareness of new rules and practices regarding AI is imperative to keeping litigation efficient, fair, and successful. Cassels will continue to closely follow developments in this area and keep businesses and interested parties informed.

_____________________________

1 Court of King’s Bench of Manitoba, “Re: Use of Artificial Intelligence in Court Submissions”, <practice_direction_-_use_of_artificial_intelligence_in_court_submissions.pdf (manitobacourts.mb.ca)>
2 Supreme Court of Yukon, “Practice Direction, General-29: Use of Artificial Intelligence Tools”, <GENERAL-29 Use of AI.pdf (yukoncourts.ca)>
3 Forbes, “Lawyer Used ChatGPT in Court – And Cited Fake Cases. A Judge Is Considering Sanctions”, <Lawyer Used ChatGPT In Court—And Cited Fake Cases. A Judge Is Considering Sanctions (forbes.com)>
4 The National Law Review, Vol. 13, No. 208, “Will Mandatory Generative AI Use Certifications Become the Norm in Legal Filings?”, <Use Of Generative Artificial Intelligence In Court Filings (natlawreview.com)>; United States District Court for the Northern District of Illinois, “Standing Order for Civil Cases Before Magistrate Judge Fuentes”, <Standing Order For Civil Cases Before Judge Fuentes rev’d 5-31-23 (002).doc (uscourts.gov)>
5 Court of King’s Bench of Manitoba, supra note 1.

This publication is a general summary of the law. It does not replace legal advice tailored to your specific circumstances.

For more information, please contact the authors of this article or any member of our Litigation Group.