
AI Hallucination Cases: Australia

This page provides links to Australian legal cases involving Artificial Intelligence (AI) “hallucinations”, documenting instances where Generative AI tools have produced false legal content, including fabricated case citations, legislation, judicial quotes, or misrepresented legal propositions. These lists are continuously updated and are non-exhaustive:

Generative AI (Gen AI) may take the form of large language model (LLM) programs such as OpenAI’s ChatGPT, xAI’s Grok, Meta’s Llama, Anthropic’s Claude, Google’s Gemini, and Microsoft’s Copilot, and may also include law-specific programs like Lexis+ AI, Westlaw Precision (Australia), and LEAP Legal AI.1

The Supreme Court of Queensland has issued a revised guideline dated 15 September 2025: The Use of Generative AI: Guidelines for Judicial Officers, which has been adopted by the Supreme Court, District Court, and Magistrates Courts. See also Artificial Intelligence: Guidelines for Responsible Use by Non-Lawyers (Queensland Courts).2

There are also instances where AI hallucinations have led to false allegations against individuals. For example, Microsoft Copilot fictitiously reported that a German journalist had confessed to a serious crime.3 The AI tool described him as “an escapee from a psychiatric institution, a con-man who preyed on widowers, a drug dealer and a violent criminal”.4 In reality, these were AI “hallucinations” based on court cases the journalist had written about. Similarly in Australia, a Victorian mayor was falsely described by OpenAI’s ChatGPT as having been charged with serious criminal offences, when in fact he was a whistleblower.5

Sources include the Supreme Court Library of Queensland, AustLII and Westlaw Australia. See also the AI Hallucination Cases Database by Damien Charlotin, which tracks decisions from around the world and the AI Hallucination Cases Tracker (UK) by Mathew Lee.

Queensland:

Court/Tribunal: Queensland Industrial Relations Commission
Case citation: Carrington v TAFE Queensland [2025] QIRC 340
Decision date: 5 December 2025
AI hallucination: The Agent for the Appellant provided references of case authorities relied upon in reply submissions for each ground of appeal: at [125].6 O’Neill IC said at [132]: “It appears that the citation of those matters may result from the use of an Artificial Intelligence search engine. I repeat the warning for litigants I provided in Goodchild v State of Queensland (Queensland Health)7 of the danger in relying on artificial intelligence search engines when preparing submissions to be filed in a Court or Tribunal”.8 O’Neill IC advised at [133]: “If parties intend to rely upon such sources of information, it is important that they verify that the case authorities provided by the search engine are actually genuine decisions prior to filing their submissions”.9 No reliance was placed on the hallucinated authorities in the matter.10

Court/Tribunal: Supreme Court of Queensland
Case citation: Khoury v Kooij [2025] QSC 217
Decision date: 3 September 2025
AI hallucination: The litigant in person’s (Applicant) use of Gen AI: written submissions cited false names and non-existent cases, contained fabricated quotes, and referred to a non-existent legislative sub-paragraph: at [15]–[17].11

Court/Tribunal: Queensland Industrial Relations Commission
Case citation: Ivins v KMA Consulting Engineers Pty Ltd & Ors [2025] QIRC 141
Decision date: 2 June 2025
AI hallucination: The litigant in person’s (Complainant) use of AI produced fabricated case law in written submissions: at [48], [77]–[78].12

Court/Tribunal: Queensland Civil and Administrative Tribunal (QCAT)
Case citation: Chief Executive, Department of Justice v Wise and Wise Real Estate Pty Ltd & Anor [2025] QCAT 222
Decision date: 13 May 2025
AI hallucination: The self-represented litigants’ (Respondents) use of false case citations and reliance on four non-existent cases as authorities in submissions: at [52]–[55].13

Court/Tribunal: Queensland Civil and Administrative Tribunal (QCAT)
Case citation: LJY v Occupational Therapy Board of Australia [2025] QCAT 96
Decision date: 26 March 2025
AI hallucination: The self-represented litigant (Applicant) made a number of submissions in support of a Stay Application (ChatGPT): non-existent case citations were used as authorities in support of arguments: at [18]–[19], [21]–[22]. J Dann said at [23]: “Queensland Courts have issued Guidelines for the Use of Generative Artificial Intelligence (AI) Guidelines for Responsible Use by Non-Lawyers. … These guidelines apply in the Tribunal”.14

Court/Tribunal: Queensland Industrial Relations Commission
Case citation: Goodchild v State of Queensland (Queensland Health) [2025] QIRC 46
Decision date: 13 February 2025
AI hallucination: The self-represented litigant’s (Applicant) use, via Internet search engines or AI, of five non-existent Fair Work Commission decisions as authorities: at [29], [36]–[39].15 O’Neill IC said at [36]: “… Utilising a number of legal search resources and the Fair Work Commission website, no decisions as identified by the Applicant could be located, either by party name or file number”. O’Neill IC observed at [39]: “This appears to be a salutary lesson for litigants in the dangers of relying on general search engines on the internet or artificial intelligence when preparing legal documents”. O’Neill IC gave no weight to the authorities cited: at [39].

Court/Tribunal: Queensland Industrial Relations Commission
Case citation: SP v RB as Trustee for the R and R Family Trust AND Others (No. 5) [2025] QIRC 016
Decision date: 14 January 2025
AI hallucination: The litigant in person’s (Respondent) use of AI in the final limb of an application for an adjournment: at [8].16

Victoria:

Court/Tribunal: Victorian Civil and Administrative Tribunal (VCAT)
Case citation: A’Vard v Mornington Peninsula SC [2025] VCAT 1035
Decision date: 8 December 2025
AI hallucination: The self-represented litigant (Applicant) advised they had used artificial intelligence to gather cases to present as part of the submission.17 The Tribunal proceeded on the basis that the citations were incorrect.18

Court/Tribunal: Supreme Court of Victoria — Court of Appeal
Case citation: Stewart v Good Shepherd Australia New Zealand [2025] VSCA 206
Decision date: 29 August 2025
AI hallucination: The litigant in person’s (Applicant) use of AI in written submissions with fabricated authorities and citations; arguments were in places supported by reference to non-existent authorities, and by reference to passages in real authorities that bore no relationship to the argument: at [62]–[63].19

Court/Tribunal: County Court of Victoria
Case citation: Wang v Moutidis [2025] VCC 1156
Decision date: 18 August 2025
AI hallucination: The litigant in person’s (Defendant) use of Gen AI in written submissions resulted in hallucinations, including a citation to a non-existent VCAT decision fabricated by Gen AI and a legally incorrect proposition: at [15].20

Court/Tribunal: Supreme Court of Victoria
Case citation: Director of Public Prosecutions v GR [2025] VSC 490
Decision date: 14 August 2025
AI hallucination: The Defendant’s Senior Counsel filed AI-generated submissions in this murder case before the Court. AI-generated hallucinations included non-existent cases, fictitious quotations (said to be from parts of the Second Reading Speech and the Commission’s Report), and fabricated case references (see G. Use of artificial intelligence, [61]–[80]).21 Senior Counsel admitted the filed submissions were wrong and further noted that the cases referred to were incorrectly cited and did not apply to this matter: at [66].22 Elliott J said at [73]: “The pervasiveness of potentially misleading information caused by the use of artificial intelligence did not end there. The revised submissions filed the afternoon before the hearing were not properly reviewed by defence or prosecution counsel. … [The] revised submissions referred to legislation that did not exist, and also a provision in the Act that was said to have been inserted and then repealed, which in fact never occurred and which provision never existed”.23

Court/Tribunal: Supreme Court of Victoria — Court of Appeal
Case citation: Nikolic & Anor v Nationwide News Pty Ltd & Anor [2025] VSCA 112
Decision date: 23 May 2025
AI hallucination: The litigants’ (Plaintiffs) purported reliance on two non-existent decisions in costs submissions.24 Beach JA said at [39]: “They are most probably hallucinations of the kind referred to in paragraph 7(a) of the New South Wales Supreme Court Practice Note SC Gen 23, which deals with the use of generative artificial intelligence (Gen AI) in that Court”.25

Court/Tribunal: Victorian Civil and Administrative Tribunal (VCAT)
Case citation: Bangholme Investments Pty Ltd v Greater Dandenong CC [2025] VCAT 290
Decision date: 3 April 2025
AI hallucination: The litigant in person’s use of AI-generated material to determine Tribunal processes, with the AI results plainly incorrect: at [14]–[15].26

Court/Tribunal: Supreme Court of Victoria — Court of Appeal
Case citation: Kaur v RMIT [2024] VSCA 264
Decision date: 11 November 2024
AI hallucination: In the litigant in person’s (Applicant) application for leave to appeal, several documents were provided that appeared to have been drafted with the assistance of a large language model artificial intelligence (LLM AI), such as ChatGPT. They contained some citations to cases that do not exist: at [26].27

New South Wales:

Court/Tribunal: Supreme Court of New South Wales
Case citation: In the matter of Bayfoyle Pty Ltd [2025] NSWSC 1607
Decision date: 23 December 2025
AI hallucination: Black J accepted that the solicitor cited several non-existent (or ‘AI hallucinated’) cases in submissions, likely generated by artificial intelligence, breaching duties under Practice Note SC Gen 23 and rule 19 of the Legal Profession Uniform Law Australian Solicitors’ Conduct Rules, but found it had no real impact on the hearing: at [43]. Black J declined to refer the matter to the Legal Services Commissioner, leaving the option open to the opposing party: at [43].

Court/Tribunal: NSW Civil and Administrative Tribunal (NCAT)
Case citation: Huang v Champion Homes Sales Pty Ltd [2025] NSWCATAP 271
Decision date: 30 October 2025
AI hallucination: The self-represented Appellant submitted Amended Grounds of Appeal. The Tribunal found that she had used generative artificial intelligence (Gen AI) in their preparation, as was evident from the document and conceded by her: at [123].28 The Tribunal observed that paragraph 16 of NCAT Procedural Direction 7 (Use of Generative Artificial Intelligence) requires that, where generative artificial intelligence (Gen AI) has been used in preparing submissions or summaries, parties must include in the body of the document a verification that all cited authorities, case law, legislation, and references exist, are accurately reproduced, and are relevant to the proceedings: at [124].29

Court/Tribunal: NSW Industrial Relations Commission
Case citation: Howe v Secretary, New South Wales Department of Education [2025] NSWIRComm 1081
Decision date: 17 September 2025
AI hallucination: The litigant in person’s (Applicant) alleged use of Generative AI in preparing submissions. The third case cited could not be located by the Commission: at [26].30 The Respondent speculated, based on a similar process of searching for the cases, that the Applicant’s submissions had been prepared using generative artificial intelligence: at [27].31

Court/Tribunal: District Court of New South Wales
Case citation: Gribble v ESSENTIAL ENERGY trading as Essential Energy [2025] NSWDC 344
Decision date: 29 August 2025
AI hallucination: The self-represented Plaintiff’s pleadings show clear evidence of the use of Generative Artificial Intelligence (Gen AI): at [33].32 The Plaintiff conceded that he had used Gen AI: at [38].33 Gibson DCJ expressed concern in relation to how courts should handle such AI-generated ‘hallucinations’ and self-represented litigants: at [39]–[44].34

Court/Tribunal: NSW Civil and Administrative Tribunal (NCAT)
Case citation: Meniscus Pty Ltd ATF The Meniscus Trust v Chief Commissioner of State Revenue [2025] NSWCATAD 209
Decision date: 21 August 2025
AI hallucination: The litigant in person (Applicant) stated that he had used Generative Artificial Intelligence (Gen AI) in the preparation of his submissions (Google Gemini and ChatGPT): at [37].35 The Tribunal, however, placed more weight on the records included in the Applicant’s bundles than on the submissions prepared by the Applicant with the use of Gen AI: at [40].36

Court/Tribunal: Supreme Court of New South Wales — Court of Appeal
Case citation: May v Costaras [2025] NSWCA 178
Decision date: 8 August 2025
AI hallucination: The self-represented litigant’s (Respondent) use of Generative AI in preparing oral submissions, which included a list of authorities containing a non-existent case and others which, although they existed, had little to do with the legal issues in this case: at [3]–[17], [49].37

Court/Tribunal: NSW Civil and Administrative Tribunal (NCAT)
Case citation: HFI v Commissioner of Police, NSW Police Force [2025] NSWCATAD 171
Decision date: 17 July 2025
AI hallucination: The litigant in person (Applicant) stated that she had drafted the content of her statements and then uploaded the documents to ChatGPT to improve them: at [40].38 The Applicant also used ChatGPT to prepare supplementary submissions including case references: at [40].39 J Smith said at [41]: “Clause 14 of NCAT Practice Direction 7 – Use of Generative Artificial Intelligence states that Generative Artificial Intelligence must not be used for the purpose of altering, embellishing, strengthening or diluting or otherwise rephrasing a witness’s evidence when expressed in written form.”40

Court/Tribunal: District Court of New South Wales
Case citation: Bottrill v Graham & Anor (No 2) [2025] NSWDC 221
Decision date: 20 June 2025
AI hallucination: The use of Gen AI by the Second Defendant, whose submissions contained errors due to the use of artificial intelligence sources: at [11].41 The submissions consisted largely of inaccurate legal principles as a result of having been drafted using artificial intelligence programs: at [14].42

Federal:

Court/Tribunal: Federal Circuit and Family Court of Australia (Division 1) — Appellate Division
Case citation: Tekla & Tekla [2025] FedCFamC1A 245
Decision date: 23 December 2025
AI hallucination: The litigant in person (Appellant) submitted a Summary of Argument that was created using generative artificial intelligence: (a) citing a non-existent case (purported to be a 2004 FCoA case) and misstating the test from House v The King (1936) 55 CLR 499 (unreasonableness as a test of ‘fairness’); (b) citing a case in support of the relevance of family violence, despite the case rejecting such claims ([46]); (c) relying upon two cases, with a 2016 citation and a 2007 citation, which do not exist; and (d) relying upon an incorrect case citation, claiming that it stands for a proposition which does not appear in the judgment: at [14]. Riethmuller J did not include the alleged citations of the non-existent cases, to avoid search engines indexing them and potentially misleading others: at [15].

Court/Tribunal: Federal Circuit and Family Court of Australia (Division 2)
Case citation: Pasuengos v Minister for Immigration and Citizenship [2025] FedCFamC2G 2129
Decision date: 22 December 2025
AI hallucination: The Applicant’s legal counsel filed an outline of written submissions (26 May 2025) referring to a number of authorities that do not exist: at [27]–[28]. Counsel conceded that the cited authorities did not exist and could not be provided: at [30]. On behalf of the applicant’s solicitors, counsel admitted that the submissions had been generated using artificial intelligence and that the solicitors had failed to verify the cited cases: at [30]. Gerrard J said at [31]: “The issues arising from the representative’s ill-conceived reliance on AI hallucinated authorities will be considered in a separate judgment”.

Court/Tribunal: Federal Circuit and Family Court of Australia (Division 1) — Appellate Division
Case citation: Rathi & Rathi [2025] FedCFamC1A 238
Decision date: 22 December 2025
AI hallucination: Williams J rejected the litigant in person’s (Appellant’s) contention that s 97(3) or s 43(c) of the Act supported her submission, because it plainly does not: there is no s 43(c): at [40]. The reliance on those two sections of the Act in submissions clearly demonstrated that the appellant had relied on generative Artificial Intelligence (AI): at [40]. As the appeal wholly lacked merit, it was dismissed: at [70].

Court/Tribunal: Federal Circuit and Family Court of Australia (Division 2)
Case citation: Hugo v Affinity Education Group Pty Ltd [2025] FedCFamC2G 1536
Decision date: 18 September 2025
AI hallucination: The litigant in person’s (Applicant) use of AI in submissions. The Applicant referred to and cited non-existent authorities.43 Liveris J considered the circumstances similar to Finch v The Heat Group.44 Liveris J said at [70]: “The court has repeatedly and increasingly emphasised the cautions required in the use of artificial intelligence in court proceedings by legal practitioners and parties, particularly where the use of artificial intelligence is not disclosed, material produced through artificial intelligence is not verified and no certification as to accuracy is given”.45

Court/Tribunal: Federal Circuit and Family Court of Australia (Division 1)
Case citation: Helmold v Mariya (No 2) [2025] FedCFamC1A 163
Decision date: 12 September 2025
AI hallucination: The litigant in person’s (Appellant) use of Generative Artificial Intelligence (AI) in Family Law proceedings. The Appellant deployed Generative AI to prepare his written documents (Notice of Appeal and Summary of Argument), citing fictitious cases: at [5]–[6].46

Court/Tribunal: Federal Circuit and Family Court of Australia (Division 2)
Case citation: Re Dayal (2024) 386 FLR 359
Decision date: 27 August 2024
AI hallucination: The solicitor tendered to the court a list and summary of legal authorities that do not exist.47 The solicitor informed the court that the list and summary had been prepared using an artificial intelligence (“AI”) tool incorporated in the legal practice management software he subscribes to (LEAP): at [1].48 The solicitor acknowledged that he did not verify the accuracy of the information generated by the research tool before submitting it to the court.49 The Court held that generative AI does not relieve the responsible legal practitioner of the need to exercise judgment and professional skill in reviewing the final product to be provided to the Court: at [15].50 A referral was made to the VLSBC: at [21].51

Court/Tribunal: Federal Court of Australia — General Division
Case citation: JML Rose Pty Ltd v Jorgensen (No 3) [2025] FCA 976
Decision date: 19 August 2025
AI hallucination: The self-represented litigant (First Respondent) used a form of generative artificial intelligence (AI) to assist with his written and oral submissions for an Annulment Application.52 Wheatley J said at [7]: “Many of the case citations were inaccurate. Some of the purported quoted passages did not exist. Such matters are likely the product of “hallucinations”. There has been an approach, which I will adopt, of redacting false case citations so that such information is not further propagated by AI systems”.53

Court/Tribunal: Federal Circuit and Family Court of Australia (Division 2)
Case citation: JNE24 v Minister for Immigration and Citizenship [2025] FedCFamC2G 1314
Decision date: 15 August 2025
AI hallucination: The Applicant’s lawyer filed written submissions with the Court that contained four case citations of authorities that do not exist or do not stand as authority for the proposition: at [1], [26].54 The Applicant’s lawyer filed an affidavit which advised that he had relied upon Claude AI “… [A]s a research tool to identify potentially relevant authorities and to improve my legal arguments and position”: at [14].55 The Applicant’s lawyer said that he then used another AI tool, Microsoft Copilot, to validate the submissions: at [14].56 The Court relied on Valu (No 2),57 where Skaros J determined that filing an application and submissions which contained, inter alia, fictitious citations was conduct which fell short of the legal representatives’ duties to both their client and the Court: at [18].58

Court/Tribunal: Federal Circuit and Family Court of Australia (Division 2)
Case citation: Valu v Minister for Immigration and Multicultural Affairs (No 2) (2025) 386 FLR 365
Decision date: 31 January 2025
AI hallucination: The Applicant’s Legal Representative (ALR) filed submissions which contained citations to Federal Court of Australia cases that do not exist and alleged quotes from the Tribunal that did not exist (hallucinations).59 The ALR stated that he had used AI to identify Australian cases, but it provided him with non-existent case law: at [10].60 Skaros J held (at [37]–[38]) that, given the strong public interest in referring this sort of conduct to the regulatory authority in NSW, and given the increased use of generative AI tools by legal practitioners, the ALR be referred to the OLSC.61

South Australia:

Court/Tribunal: Supreme Court of South Australia
Case citation: Rowe v National Australia Bank Ltd [2025] SASC 50
Decision date: 17 April 2025
AI hallucination: The litigants in person (Applicants) used AI in written and oral submissions: at [37].62 The Applicants relied on three High Court decisions and a NSW Court of Appeal decision.63 These decisions do not exist and are likely AI hallucinations (B Doyle J).64

Court/Tribunal: Supreme Court of South Australia
Case citation: Hanna v Flinders University [2025] SASC 6
Decision date: 29 January 2025
AI hallucination: The Appellant sought to rely on an authority in which the citation provided did not align with a judgment from any Australian court.65 Having conducted further enquiries, Hughes J was satisfied at [68]: “… [T]he authority cited was generated by artificial intelligence and does not correlate with a decision of an Australian court. I do not consider that the appellant has misled the Court deliberately but has mistakenly assumed that her internet searches would yield accurate results”.66

Tasmania:

Court/Tribunal: Supreme Court of Tasmania
Case citation: Lakaev v McConkey [2024] TASSC 35
Decision date: 12 July 2024
AI hallucination: The self-represented litigant (Appellant) referred in their submissions to the High Court’s decision in De L v Director-General, NSW Department of Community Services (1996) 187 CLR 640 (‘De L’): at [55].67 The Appellant’s commentary on De L suggested that it involved an appeal challenging the veracity of a witness’s evidence.68 However, De L concerned international child abduction law and was unrelated to issues of false evidence.69 The Appellant also relied on a case that appears not to exist, although there is an unrelated case bearing the same citation.70 Blow CJ warned of the risk of using AI, stating at [54]: “… When artificial intelligence is used to generate submissions for use in court proceedings, there is a risk that the submissions that are produced will be affected by a phenomenon known as ‘hallucination’”.71 The appeal was dismissed.

Australian Capital Territory:

Court/Tribunal: Supreme Court of the Australian Capital Territory
Case citation: DPP v Khan [2024] ACTSC 19
Decision date: 7 February 2024
AI hallucination: The offender tendered a character reference that appeared to be generated by, or rewritten with the assistance of, an LLM such as ChatGPT.72 Mossop J said at [43]: “In my view, it is clearly inappropriate that personal references used in sentencing proceedings are generated by, or with the assistance of, large language models as, if they are not objected to on that basis, it becomes difficult for the court to work out what, if any, weight can be placed upon the facts and opinions set out in them”.73

Western Australia:

Court/Tribunal: Supreme Court of Western Australia — Court of Appeal
Case citation: Nash v Director of Public Prosecutions (WA) [2023] WASCA 75
Decision date: 8 May 2023
AI hallucination: In this appeal against conviction, the self-represented litigant (Appellant) prepared the case himself, with possible assistance in later submissions (including, perhaps, from an AI program like ChatGPT).74 Quinlan CJ said at [9]: “Neither form of submission made coherent submissions as to why the trial judge’s decision was affected by material error or otherwise gave rise to a miscarriage of justice. Nor did the material sought to be adduced by … [the appellant] as additional evidence on the appeal disclose any miscarriage of justice”.75 Leave to appeal was refused and the appeal dismissed.76

Footnotes:

  1. Supreme Court of New South Wales, Practice Note SC Gen 23 – Use of Generative Artificial Intelligence (Gen AI) (Practice Note, 28 January 2025) <https://supremecourt.nsw.gov.au/documents/Practice-and-Procedure/Practice-Notes/general/current/PN_SC_Gen_23.pdf> (‘Practice Note SC Gen 23’). ↩︎
  2. Supreme Court of Queensland, The Use of Generative AI: Guidelines for Judicial Officers (Guidelines, 15 September 2025) <https://www.courts.qld.gov.au/the-use-of-generative-ai-guidelines-for-judicial-officers.pdf>; see also Queensland Courts, Artificial Intelligence: Guidelines for Responsible Use by Non-Lawyers (Guidelines, 15 September 2025) <https://www.courts.qld.gov.au/Artificial-Intelligence_Guidelines-for-Non-Lawyers.pdf>. ↩︎
  3. Anna Kelsey-Sugg and Damien Carrick, ‘AI Hallucinations Caused Artificial Intelligence to Falsely Describe These People as Criminals’, ABC News (online, 4 November 2024) <https://www.abc.net.au/news/2024-11-04/ai-artificial-intelligence-hallucinations-defamation-chatgpt/104518612>. ↩︎
  4. Ibid. ↩︎
  5. Laura Mayers, Stephen Martin and Debbie Rybicki, ‘Hepburn Mayor May Sue OpenAI for Defamation Over False ChatGPT Claims’, ABC News (online, 6 April 2023) <https://www.abc.net.au/news/2023-04-06/hepburn-mayor-flags-legal-action-over-false-chatgpt-claims/102195610>. In this 2023 instance, a Victorian councillor considered initiating a defamation action against OpenAI, the creator of ChatGPT, after the AI tool falsely described the mayor as a guilty party in a bribery scandal instead of identifying him as the whistleblower. ↩︎
  6. Carrington v TAFE Queensland [2025] QIRC 340, [125] (O’Neill IC) (‘Carrington’). ↩︎
  7. Goodchild v State of Queensland (Queensland Health) [2025] QIRC 46 (O’Neill IC) (‘Goodchild’). ↩︎
  8. Carrington (n 6) [132]. ↩︎
  9. Ibid [133]. ↩︎
  10. Ibid [131]. ↩︎
  11. Khoury v Kooij [2025] QSC 217, 5 [15]–[17] (Martin SJA). ↩︎
  12. Ivins v KMA Consulting Engineers Pty Ltd & Ors [2025] QIRC 141, 14 [48], 20 [77]–[78]. ↩︎
  13. Chief Executive, Department of Justice v Wise and Wise Real Estate Pty Ltd & Anor [2025] QCAT 222, 11 [52]–[55]; see also LJY v Occupational Therapy Board of Australia [2025] QCAT 96 (‘LJY’). ↩︎
  14. LJY (n 13) [18]–[19], [21]–[23]; Mathew Lee, ‘AI Hallucinations and Court Users: LJY v Occupational Therapy Board of Australia and an Emerging UK-Australian Divide?’, Natural and Artificial Law (online, 2025) <https://naturalandartificiallaw.com/ai-hallucinations-by-chatgpt/>. ↩︎
  15. Goodchild (n 7) 9 [29], 11 [36]–[39]. ↩︎
  16. SP v RB as Trustee for the R and R Family Trust AND Others (No. 5) [2025] QIRC 016, 4 [8] (Pratt IC). ↩︎
  17. A’Vard v Mornington Peninsula SC [2025] VCAT 1035, [47]. ↩︎
  18. Ibid. ↩︎
  19. Stewart v Good Shepherd Australia New Zealand [2025] VSCA 206, [62]–[63] (Richards JA). ↩︎
  20. Wang v Moutidis [2025] VCC 1156, [15] (Kirton J). ↩︎
  21. Director of Public Prosecutions v GR [2025] VSC 490, [61]–[80] (Elliott J) (‘GR’). ↩︎
  22. Ibid [61]; AP, ‘Senior Lawyer Apologises After Filing AI-generated Submissions in Victorian Murder Case’, ABC News (online, 15 August 2025) <https://www.abc.net.au/news/2025-08-15/victoria-lawyer-apologises-after-ai-generated-submissions/105661208>. ↩︎
  23. GR (n 21) [73]; ‘In May 2024, this court published “Guidelines for litigants. Responsible use of artificial intelligence in litigation”. It is essential that all litigants and practitioners adhere to these guidelines.’ quoted from GR [78] (Elliott J). ↩︎
  24. Nikolic & Anor v Nationwide News Pty Ltd & Anor [2025] VSCA 112, [36] (Beach JA). ↩︎
  25. Ibid [36]–[39]; Practice Note SC Gen 23 (n 1). ↩︎
  26. Bangholme Investments Pty Ltd v Greater Dandenong CC [2025] VCAT 290, [14]–[15]. ↩︎
  27. Kaur v RMIT [2024] VSCA 264, [26] (Walker JA). ↩︎
  28. Huang v Champion Homes Sales Pty Ltd [2025] NSWCATAP 271, [123]. ↩︎
  29. Ibid [124]; NSW Civil and Administrative Tribunal, NCAT Procedural Direction 7 – Use of Generative Artificial Intelligence (Gen AI) (Procedural Direction, 7 March 2025) <https://ncat.nsw.gov.au/documents/procedural-directions/ncat-pd7-use-of-generative-ai.pdf>. ↩︎
  30. Howe v Secretary, New South Wales Department of Education [2025] NSWIRComm 1081, [26] (‘Howe’). Commissioner Muir stated that ‘… it would seem that what has likely occurred is that the Applicant has utilised generative AI to prepare the AS, but regrettably, has not taken any care to ensure that the submissions he is advancing to the Commission are based on actual legal authority or principle’, quoted in Howe [17]. ↩︎
  31. Howe (n 30) [27]. ↩︎
  32. Gribble v ESSENTIAL ENERGY trading as Essential Energy [2025] NSWDC 344, [33] (Gibson DCJ). ↩︎
  33. Ibid [38]. ↩︎
  34. Ibid [39]–[44]. ↩︎
  35. Meniscus Pty Ltd ATF The Meniscus Trust v Chief Commissioner of State Revenue [2025] NSWCATAD 209, [37] (J Smith, Senior Member). ↩︎
  36. Ibid [40]. ↩︎
  37. May v Costaras [2025] NSWCA 178, [3]–[17], [49]. ↩︎
  38. HFI v Commissioner of Police, NSW Police Force [2025] NSWCATAD 171, [40] (J Smith). ↩︎
  39. Ibid. ↩︎
  40. Ibid [40]–[41], [114]; NSW Civil and Administrative Tribunal, NCAT Procedural Direction 7 – Use of Generative Artificial Intelligence (Gen AI) (Procedural Direction, 7 March 2025) <https://ncat.nsw.gov.au/documents/procedural-directions/ncat-pd7-use-of-generative-ai.pdf>; see also Practice Note SC Gen 23 (n 1). ↩︎
  41. Bottrill v Graham & Anor (No 2) [2025] NSWDC 221, [11] (Gibson DCJ). ↩︎
  42. Ibid [14], [68]–[69]. ↩︎
  43. Hugo v Affinity Education Group Pty Ltd [2025] FedCFamC2G 1536, [69] (Liveris J) (‘Hugo’). ↩︎
  44. Ibid; Finch v The Heat Group [2024] FedCFamC2G 161, [137]–[138] (Riley J). ↩︎
  45. Hugo (n 43) [70]. ↩︎
  46. Helmold v Mariya (No 2) [2025] FedCFamC1A 163, [5]–[6] (Aldridge, Campton and Christie JJ). ↩︎
  47. Re Dayal (2024) 386 FLR 359, 360 [1] (Humphries J) (‘Re Dayal’). ↩︎
  48. Ibid. LEAP (Legal Electronic Access Program) refers to a widely used cloud-based legal practice management software designed in Australia. LEAP has an integrated AI solution. ↩︎
  49. Re Dayal (n 47). ↩︎
  50. Ibid 359, 360 [1], [10]–[15]. ↩︎
  51. Ibid 363 [21]. Office of the Victorian Legal Services Board and Commissioner (VLSBC). ↩︎
  52. JML Rose Pty Ltd v Jorgensen (No 3) [2025] FCA 976, [7] (Wheatley J). ↩︎
  53. Ibid. ↩︎
  54. JNE24 v Minister for Immigration and Citizenship [2025] FedCFamC2G 1314, [1], [26] (Gerrard J) (‘JNE24‘). ↩︎
  55. Ibid [14]. ↩︎
  56. Ibid. ↩︎
  57. Valu v Minister for Immigration and Multicultural Affairs (No 2) (2025) 386 FLR 365 (Skaros J) (‘Valu No 2’). ↩︎
  58. Ibid; JNE24 (n 54) [18]. ↩︎
  59. Valu No 2 (n 57) 367 [1]–[4], 368 [18]. ↩︎
  60. Ibid 367 [1]–[4], [10]. ↩︎
  61. Ibid 371 [37]–[38]. ↩︎
  62. Rowe v National Australia Bank Ltd [2025] SASC 50, [37] (B Doyle J) (‘Rowe’). ↩︎
  63. Ibid. The references said not to exist are (full citations redacted): Dawson v Dawson; Williams v The Queen; R v Loveridge; Cranbrook Property Group Pty Ltd v Kim. ↩︎
  64. Rowe (n 62). ↩︎
  65. Hanna v Flinders University [2025] SASC 6, [68] (Hughes J). ↩︎
  66. Ibid. ↩︎
  67. Lakaev v McConkey [2024] TASSC 35, [55] (Blow CJ) (‘Lakaev’); De L v Director-General, NSW Department of Community Services (1996) 187 CLR 640. ↩︎
  68. Lakaev (n 67). ↩︎
  69. Ibid [56]. ↩︎
  70. Ibid [57]. ↩︎
  71. Ibid [54]. ↩︎
  72. DPP v Khan [2024] ACTSC 19, [39] (Mossop J). ↩︎
  73. Ibid [43]. ↩︎
  74. Nash v Director of Public Prosecutions (WA) [2023] WASCA 75, [9]. ↩︎
  75. Ibid. ↩︎
  76. Ibid [19]. ↩︎

⚖️ Disclaimer: The material on this website is for informational purposes only and does not constitute legal advice. For legal advice, contact an admitted legal practitioner in your jurisdiction. AI tools are not used in the generation of content on this site. While every effort is made to ensure the accuracy of the information provided, readers are advised to consult the primary or secondary sources in all instances.