by Dyane O’Leary
Artificial intelligence, or AI, has held a spot in legal workflows for years (e.g., natural language searching). But when ChatGPT exploded onto the scene as the fastest-growing consumer app ever, it sparked fresh enthusiasm for and interest in this new “Generative AI” or “GenAI”—along with skepticism and distrust from a historically conservative profession.
On the one hand, the technology is mind-blowing. It strikes at the heart of something many lawyers consider their most precious commodity: words. The machine learning that fuels the large language models behind tools like ChatGPT is a super-charged distant cousin of the basic predictive text and autocomplete we enjoy on smartphones. A common refrain is that these tools simply “predict the next word.” True, but oversimplified. Trained on more language than we can imagine (in some cases, much of the public internet), these systems make billions of mathematical language connections, mimicking human patterns, context, syntax, and creativity.
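For readers who want a concrete feel for “predict the next word,” here is a toy sketch (in Python, purely for illustration) that counts which word most often follows each word in an invented sample sentence and predicts from those counts. It is a deliberately crude stand-in for statistical next-word prediction, nothing like the billions of learned parameters inside a real large language model.

```python
# Toy illustration of "predict the next word": count which word most often follows
# each word in a sample sentence, then predict from those counts. Real large
# language models learn billions of parameters; this sketch only conveys the basic
# idea of statistical next-word prediction. The sample text is invented.
from collections import Counter, defaultdict

sample_text = "the court held that the motion was denied because the motion was untimely"

# Build a table: for each word, how often does each following word appear?
follows = defaultdict(Counter)
words = sample_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the most frequent follower of `word` in the sample text."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else "(no prediction)"

print(predict_next("the"))     # "motion" (it follows "the" twice; "court" only once)
print(predict_next("motion"))  # "was"
```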
On the other hand, GenAI is just that: a tool. Some lawyers use contract analytics software, others don’t. Some use knowledge management tools, others Ctrl-F their way through old files. The ins and outs of work product and the business administration of law are nuanced. The end goal is competent client representation, and lawyers enjoy endless choices among process paths.
So how do we navigate this new GenAI era? Sensationalized stories of AI making up cases (known as hallucinations) are serious, but they are a sliver of a complicated and evolving picture. It is easy to dismiss the chatter as hype; it is much harder to separate fact from fiction and stay “abreast” of the “benefits and risks associated with relevant technology” per Comment 8 of Mass. R. Prof. Conduct 1.1. At this early stage in the arc of new technology, here are some baseline “Need to Knows”:
- Not All GenAI Tools Are the Same: “ChatGPT” is often misused as a generic term for every GenAI platform. In a year’s time, the market has been flooded with competitors vying to offer the next best language model tool, whether free, paid, or an enterprise version designed for an organization or business. The differences between these tools are real, from how many characters are accepted in a prompt (the instructions a user provides to the AI) to whether a user can upload their own document(s) against which the tool works its magic. Beyond these general-purpose models, a laundry list of “GenAI for lawyers” offerings has emerged. See, e.g., AI Tools for Lawyers: Improving Efficiency & Productivity in Law Firms, www.clio.com; https://theresanaiforthat.com/. Compared to public models, most of these tools—especially those from reputable legal research and law practice management providers—tout two notes that should be music to lawyers’ ears: (a) greater security for work product and client information; and (b) greater protection against hallucinations. This is because the models work with source-specific guardrails around them, such as a trusted legal database of primary law or a set of deposition transcripts. And they purport to do so with the same level of security lawyers have relied on for decades to, for example, protect online search term queries. They combine the word-prediction power of GenAI with familiar and reliable sources of legal information. So what tools should you use? That depends on why and how you are using them. One size does not fit all. Appreciate and research the differences. Choose wisely.
- The Art of Prompting: Speaking of use, “garbage in” means “garbage out.” Prompting is an interactive back-and-forth akin to directing a junior colleague—not a one-shot natural language search extracting a phrase on Google or a research database. Effective prompting is rarely a one-and-done task. Tools are growing more powerful, but for now prompting still matters, especially with non-legal tools. The instructions combine to direct the vast spinning wheel of the model where to land. Here are some tips, with a short illustrative sketch after the list:
- Persona Priming: Offer context & background. “You are a lawyer interested in a simple and accessible writing tone for someone without a legal background.”
- Prompt the Task: Give specific instructions. “Draft a description of what reasonable accommodations mean in the context of the ADA.”
- Specify Format & Polish: Adjust and refine outputs. “Turn that summary of ADA law into a bullet point list with ten different one-sentence ideas.”
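The minimal sketch below shows how those three layers can stack in one prompting session. It is written in Python purely for illustration: the send_to_model function is a hypothetical placeholder for whichever GenAI tool is in use, and the role/content message format simply mirrors a common chat-style convention rather than any particular vendor’s API.

```python
# Illustrative only: layering persona, task, and format instructions into one
# prompting session. `send_to_model` is a hypothetical stand-in for whatever GenAI
# tool is being used; it is assumed to take the running list of messages and
# return the model's latest reply as text.

def draft_ada_summary(send_to_model) -> str:
    messages = [
        # 1. Persona priming: context, audience, and tone
        {"role": "system", "content": (
            "You are a lawyer interested in a simple and accessible writing tone "
            "for someone without a legal background."
        )},
        # 2. Prompt the task: specific instructions
        {"role": "user", "content": (
            "Draft a description of what reasonable accommodations mean in the "
            "context of the ADA."
        )},
    ]
    first_draft = send_to_model(messages)
    messages.append({"role": "assistant", "content": first_draft})

    # 3. Specify format and polish: refine the output rather than starting over
    messages.append({"role": "user", "content": (
        "Turn that summary of ADA law into a bullet point list with ten "
        "different one-sentence ideas."
    )})
    return send_to_model(messages)
```

The point is the iterative structure, not the plumbing: each turn narrows the output, much as a lawyer would refine instructions to a junior colleague.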
- Finding Best Use Cases: Legal work ranges from the more mundane (organize, summarize, edit) to the substantive (analyze, compare, argue). GenAI will not fit every step. Builders do not use a wrench to hammer a nail; it could work in a pinch, but it is not the best tool for the job. So why is it a surprise when a model trained on internet language, from Reddit comments to random websites, cannot provide a correct legal citation and a perfect cover email for a nuanced query? That’s not its best use. GenAI is great at summarization and extraction. New whitepaper on a hot issue? Digest it in seconds. Struggling to get past the blank page for marketing content? Prompt a first draft. Preparing a witness for trial? Ask GenAI for a moot cross-examination outline. Creating a novel policy-driven argument in an appellate brief? GenAI may not be your best bet. Consider GenAI a productivity tool for the more legal-adjacent aspects of work.
- The Ethics of GenAI Use: This topic alone deserves a deep dive. See Andrew M. Perlman, The Legal Ethics of Generative AI, Suffolk U. L. Rev. (forthcoming 2024). State bars such as Florida and California have published early advisory opinions. See Fla. Eth. Op. 24-1 (Fla. State Bar Ass’n Jan. 19, 2024); State Bar Standing Comm. on Prof. Responsibility and Conduct, Practical Guidance for the Use of Generative Artificial Intelligence in the Practice of Law, State Bar of Cal. 1, 1 (Nov. 16, 2023). There is a growing list of judicial admonishments of lawyers who included fictitious case citations in filings, starting with the Southern District of New York in June 2023 and, more recently, in our own backyard from Massachusetts Judge Brian Davis in February 2024. See Mata v. Avianca, No. 22-cv-1461 (S.D.N.Y. June 22, 2023); Smith v. Farwell, No. 2282-cv-01197 (Mass. Super. Ct. Feb. 12, 2024). Though plenty of gray area remains, some themes are emerging:
- Rule 1.1 Competence & Rule 1.3 Diligence: These rules capture the duty to “trust but verify” outputs. Blind acceptance is risky at best. Consider a sliding scale: the more important and nuanced the use, the more verification against primary sources is needed; the more basic the use (a quick summary, for example), the less verification may be needed. Reviewing GenAI outputs requires what these rules have always required: careful discretion.
- Rule 1.6 Confidentiality: Lawyers have long been advised to make “reasonable efforts” to ensure the ethical use of third-party software providers that store or transmit client information and work product over the internet. See, e.g., ABA Standing Comm. on Ethics & Prof. Resp., Formal Op. 477R (2017) and Formal Op. 498 (2021); see also R. Prof. Conduct 5.3 Cmt. 3. Absent informed consent, prompting a public tool like ChatGPT with client information or uploading work product would likely not be permissible, because even with opt-out choices in the terms of service, many companies offering free versions retain the right to review and use inputs to train their underlying language models. But paid enterprise versions or law-specific tools from reputable providers may offer the same data protection procedures lawyers have relied on to use common internet tools like Westlaw, Microsoft OneDrive, and Zoom. The tool-specific details matter.
- Rule 3.3 Candor Toward the Tribunal: Several courts and individual judges have issued orders requiring lawyers to disclose the use of AI in filings. See Ropes & Gray, Artificial Intelligence Court Order Tracker. This wave seems to have crested for now, but lawyers should be aware of the possibility of such orders—especially in federal court—and ensure compliance.
- Rule 5.3 Nonlawyer Assistance: Supervision extends both to the human assistants with whom a lawyer works and to non-human technological assistance. Beyond diligence in selecting vendors and monitoring compliance, lawyers must give paralegals, support staff, and other specialists training and direction about policies or prohibitions. Law students joining the bar are the ChatGPT generation; they will depend on guidance to transition personal habits into new professional environments.
Plenty of other questions remain—chief among them is Rule 1.4 Client Communication. Most guidance stops short of mandating disclosure of GenAI use to a client. Lawyers likely do not disclose the particulars of tools such as calendaring systems or trial presentation technologies. But guidance suggests a balance based on the risks and the intended manner and scope of the use. The question may not be whether you must inform clients but whether you should. And what about fees? No lawyer wants to field a call from a client confused about an expensive “prompting” time entry. Lawyers can charge for time spent using GenAI tools under Rule 1.5, but of course they should not overbill when the technology has saved significant time.
So is GenAI a legal tool? In the right hands with the right approach under the right ethical boundaries, sure—it is “superhuman legal support.” Smith v. Farwell, supra. But it can also be a distracting toy. Unless it is for personal experimentation, use of GenAI for the sake of using GenAI is a waste of time. Legal tools are a means to an end: competent client representation. As lawyers, we will still have our place. Technology will too. The key is how we will work together.
Professor Dyane O’Leary directs the Legal Innovation & Technology Institute at Suffolk University Law School and is the author of Legal Innovation & Technology: A Practical Skills Guide for the Modern Lawyer. Nothing in this article is offered as legal advice or ethics guidance and lawyers should consult updated jurisdiction-specific materials.