
Generative AI and the legal profession – opportunity, risk and responsibility

Published on March 25, 2026 by Sasika Jayasuriya and Thomas Felizzi

Technological innovation has always shaped the legal profession. From the transition to digital legal research platforms to the widespread adoption of electronic filing systems, lawyers have consistently adapted to tools designed to improve efficiency and productivity. Generative artificial intelligence (Gen AI) represents the next phase of this evolution and is already attracting significant attention within the legal profession.

Gen AI tools are capable of producing written text, summarising documents and assisting with research tasks in a matter of seconds. For lawyers working in demanding and time-sensitive environments, these capabilities present clear benefits. However, while the efficiencies offered by Gen AI are appealing, they are accompanied by important professional risks. Courts across several jurisdictions have already confronted situations in which lawyers relied on Gen AI generated material that contained fabricated authorities or inaccurate legal analysis. These cases highlight an important point: although the technology may be new, the professional responsibilities of legal practitioners remain unchanged.

What is Generative AI?

Generative AI (Gen AI) refers to artificial intelligence systems built on large language models (LLMs). These models are trained using enormous datasets containing text and other forms of information. Through the training process, the system learns patterns in language and relationships between words and concepts.

When a user enters a prompt, the system generates a response by predicting the most statistically likely sequence of words that should follow. This predictive capability enables the program to produce sophisticated and often persuasive responses that resemble human writing.
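As a toy illustration of this prediction step, the sketch below builds a simple word-frequency model in Python. It is a deliberate simplification (real LLMs use neural networks trained on vast corpora, not frequency counts), but the core idea is the same: given the words so far, output the statistically most likely next word. The corpus and function names here are illustrative, not drawn from any real system.

```python
from collections import Counter, defaultdict

# Tiny illustrative "training" text. Real models learn from billions of words.
corpus = (
    "the court held that the contract was void "
    "the court found that the claim was dismissed"
).split()

# Count which word follows each word in the training text.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent next word seen in training, if any."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("the"))  # "court" follows "the" most often in this corpus
```

Note what the function does not do: it never checks whether its output is true, only whether it is statistically likely given the text it was trained on. That is the same limitation, at a vastly larger scale, that produces the "hallucinations" discussed below.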

Unlike traditional artificial intelligence tools, which typically analyse or classify information, Gen AI systems produce entirely new content. This may include written text, summaries, code, images or other forms of material.

While this capacity for content generation is what makes the technology powerful, it also creates a fundamental limitation. Gen AI systems are not designed to determine whether information is correct. Their purpose is to generate plausible language, not verified truth.

Why lawyers are drawn to AI tools

The attraction of Gen AI for lawyers is easy to understand. Legal practice frequently involves the analysis of large volumes of documents and complex legal materials under tight deadlines. Tools capable of quickly summarising information or generating preliminary drafts can therefore provide meaningful efficiency gains.

In particular, Gen AI may assist practitioners by:

  • summarising lengthy documents or transcripts
  • generating early drafts of correspondence or submissions
  • identifying potential legal issues within factual scenarios
  • organising large volumes of information into structured formats
  • assisting with brainstorming legal arguments

Used appropriately, these tools may allow lawyers to spend more time on higher-level legal analysis and strategic thinking. However, the same capabilities that make Gen AI useful also create risks if the outputs are relied upon without proper scrutiny.

The risk of hallucinations

One of the most significant limitations of Gen AI is the phenomenon known as “hallucination”. AI hallucinations occur when a generative system produces information that appears credible but is in fact incorrect or entirely fabricated. In legal contexts, this may include fictional case law, inaccurate citations or legal propositions that do not exist. Because Gen AI produces responses based on probability rather than verified sources, it may confidently present incorrect information if it appears linguistically plausible.

For lawyers, the consequences of relying on such information can be significant. Submissions containing fabricated authorities may undermine the credibility of the practitioner and expose them to professional consequences. Courts in both Australia and the United States have already addressed situations in which lawyers relied on Gen AI outputs without verifying their accuracy.

Professional responsibility remains unchanged

The emergence of Gen AI does not alter the fundamental duties owed by lawyers to the court and to their clients. Practitioners remain responsible for the accuracy and integrity of the material they submit in legal proceedings. This includes ensuring that authorities are genuine, that legal arguments are properly supported and that submissions are based on reliable sources.

Gen AI may assist with drafting or research tasks, but it cannot replace the professional judgment that lies at the heart of legal practice. Various courts in Australia have made it clear that reliance on technology does not excuse failures to meet professional standards. Lawyers remain responsible for the work they produce regardless of the tools used in its preparation.

A balanced approach to AI in practice

It would be incorrect to suggest that Gen AI should be avoided entirely. Like many technological innovations, it has the potential to improve efficiency and productivity within the profession. However, lawyers must approach these tools with a clear understanding of their limitations. Outputs produced by Gen AI should be treated as preliminary material rather than authoritative sources.

Independent verification of legal authorities and factual propositions remains essential. Ultimately, the technology may assist legal practitioners, but it cannot replace the professional skill, judgment and ethical obligations that define legal practice.

As generative AI continues to develop, the challenge for the profession will be to harness its benefits while maintaining the standards of accuracy and integrity upon which the administration of justice depends.

This article was published on 25 March 2026 by Carroll & O’Dea Lawyers and is based on the relevant state of the law (legislation, regulations and case law) at that date for the jurisdiction in which it is published. Please note this article does not constitute legal advice. If you need legal advice or want to discuss a legal problem, please contact us to see if we can help. You can reach us on 1800 059 278 or via the Contact us page on our website, www.codea.com.au.
