May v Costaras [2025] NSWCA 178

A hard line on AI in the courtroom

Last month the New South Wales Court of Appeal handed down a landmark decision on how Australian courts will treat the use of generative AI in litigation. The NSWCA didn't just warn against unsupervised AI use; it drew a firm line: AI cannot ethically stand alone in preparing legal submissions.

The case at a glance

The dispute itself was a property matter between former de facto partners, Michael May (appellant) and Lila Costaras (respondent), who were joint tenants of an investment property in Scott Street, Maryborough, Queensland.

May and Costaras met in late 2020 and soon began a relationship; by early 2021 they were living together in NSW. They both had children from previous relationships, and over time they began pooling their efforts into property renovation projects.

In February 2022, they moved to Queensland after buying a house in Ann Street, Maryborough, which they lived in together. Costaras contributed heavily to renovations on another of May’s properties, doing both hands-on work and project management.

A few months later, in June 2022, they decided to buy another property — a small house in Scott Street, Maryborough — as a joint investment. The property cost approximately $180,000. The money came out of May’s bank account, but the purchase was put in both their names as joint tenants.

The decision to include Costaras on the title wasn't accidental. She had pressed for recognition of the work she had already done on renovations and wanted to make sure her contributions would be properly recognised in future projects. May agreed, and they proceeded with the purchase as joint tenants.

Unfortunately, the relationship broke down not long after. That left open the question of who really owned what share of the Scott Street property: May had provided the purchase money, but Costaras had made significant contributions to their joint projects and expected to benefit.

“AI may assist with research and drafting, but it cannot replace genuine legal analysis.”

When AI puts on submissions

After the breakdown, May argued that because he had paid the full purchase price, Costaras should not be entitled to keep her legal share of ownership.

The case went before the Supreme Court of New South Wales, where the judge found that while May had provided the funds, Costaras had made substantial contributions through her renovation work and project management. On that basis, the judge ruled the property should be divided two-thirds to May and one-third to Costaras.

Unhappy with this outcome, May appealed to the NSW Court of Appeal, hoping to secure a larger share (if not full ownership) of the property. Costaras, however, did not have a lawyer for the appeal and chose to represent herself.

It was at this stage that the unusual twist occurred: AI entered the courtroom.

Costaras admitted that she had relied on a generative AI tool to prepare her oral submissions. The judges quickly discovered some serious problems: her material cited a case that did not exist, included references to authorities with no relevance to the dispute, and put forward arguments that were incoherent or disconnected from the real issues before the court.

The Court of Appeal ultimately dismissed May’s appeal, leaving the original property orders intact. But beyond that, the judges issued a strong warning about the dangers of using AI in court without human oversight. They emphasised that while AI may assist with research and drafting, it cannot replace genuine legal analysis — and when used unsupervised, it risks wasting court time, inflating costs, and undermining the integrity of proceedings.

“AI submissions added cost and complexity without appreciable benefit.” — Bell CJ

Warnings from the Bench

The Court of Appeal used May v Costaras to send a strong signal that AI-generated submissions, when unchecked, do more harm than good. Chief Justice Bell and Justice Payne were clear that the material produced in this case only added to the cost and complexity of proceedings without offering any meaningful assistance. Instead of sharpening the legal issues, the AI script confused them.

The judges’ comments turn a growing concern into a firm directive from the Australian judiciary. For legal practitioners, the message is threefold:

  1. Independent verification is essential. Every authority, quotation, or proposition that comes from an AI tool must be checked against authoritative sources like AustLII, JADE, LexisNexis, or Westlaw.

  2. Legal reasoning cannot be outsourced. AI may draft words, but it cannot weigh credibility, nuance, or context — all of which are hallmarks of legal judgment.

  3. Disclosure will matter. Courts are signalling that if AI is used to prepare key documents such as affidavits, witness statements, or expert reports, parties may be expected to say so upfront and be ready to explain how accuracy was ensured.

The underlying warning is clear: unsupervised AI creates more risk than reward in litigation.

Importantly, their Honours stopped short of laying down a comprehensive disclosure rule. The judgment did not address, for instance, the use of AI in large-scale eDiscovery or document review. What can be drawn from May v Costaras, however, is a strong signal that where AI contributes to materials filed with the court — particularly affidavits, witness statements, or expert reports — disclosure will be expected. In other words, if AI use has the potential to affect the integrity of evidence or submissions, courts are likely to require transparency.

“Competence now means knowing the law and the limits of AI tools.”

Lawyers’ Professional Duties in the AI Age

The case also shows that the professional duty of competence is evolving. In 2025, being a competent lawyer is not only about knowing the law — it also requires understanding the capabilities and limits of the digital tools you choose to use.

Solicitors must therefore:

  • Vet AI outputs line by line. Accuracy, completeness, and relevance must all be confirmed before anything reaches a court.

  • Avoid misleading conduct. Even unintentional reliance on fabricated or irrelevant material can mislead the court, with serious consequences.

  • Protect client confidentiality. Sensitive material should never be entered into unsecured public AI platforms.

  • Be ready to demonstrate verification. If challenged, lawyers must be able to show exactly how they confirmed that the authorities and arguments they relied upon were genuine and accurate.

The stakes are high. Failure to meet these obligations could lead to judicial censure, costs orders against you or your client, or even disciplinary action for breaching ethical duties.

Practical Takeaways for Practitioners

So what can lawyers do to avoid landing themselves in the same position as Costaras?

  1. Verify everything. Always cross-check citations and case law in trusted databases.

  2. Check relevance. Do not let AI “pad” submissions with tangential or spurious material.

  3. Disclose where required. Be upfront with courts and clients about the use of AI where practice directions or ethics rules demand it.

  4. Document your review. Keep a record of what you checked, corrected, or removed from AI outputs. This shows diligence and protects you if questions are raised later.

  5. Use AI as a tool, not a crutch. AI can speed up drafting and research, but it cannot replace professional skill and responsibility.

“Verify everything before you file.”

May v Costaras is the first appellate-level decision in Australia to directly confront the misuse of AI in litigation, and its message is clear: AI can assist but it can never substitute for genuine legal expertise.

For the profession, the takeaway is not to reject these tools out of fear, but to use them responsibly. It’s a balancing act where innovation must always be paired with integrity. In court, credibility is currency, and once it is lost, no AI tool can restore it.
