With Responsible Use and Advanced Tools, Generative AI Will Change the Way We Litigate
by Jeffrey W. Greene and Elizabeth E. Georgiopoulos
Generative artificial intelligence (“GenAI”) is revolutionizing how businesses and the legal industry operate, providing new opportunities for efficiency, cost savings, and improvements in accuracy. But it also presents unique risks for attorney use, including biased programming, privacy considerations, and lack of transparency. While there are various types of GenAI platforms, publicly available platforms, such as ChatGPT and Microsoft’s Copilot, have gained significant attention. Publicly available platforms can present substantial risks for law firms because, among other things, they involve transmitting information to a third party that, absent a contractual relationship, can use the provided data for its own purposes, and because their outputs are not completely reliable. To adhere to professional obligations, counsel must restrict the use of these platforms by never inputting any client or personal information into public GenAI platforms and by independently confirming the accuracy of the results provided.
But as much as law firms should restrict the use of publicly available platforms, they can and should utilize private GenAI platforms, whether proprietary tools that law firms create themselves or customized vendor tools that are private, secure, and designed exclusively for internal use. Although many of these tools are trained with law firm-specific data, because they are private, no client or otherwise sensitive data is transmitted to a public GenAI platform. Such platforms both increase the accuracy of GenAI’s responses and safeguard confidential information.
This article describes how litigators can capitalize on the benefits of GenAI while adhering to their professional obligations.
Potential Uses of Private GenAI Platforms in Litigation
- Preparation of Initial Drafts. GenAI can help counsel prepare initial drafts of communications and agreements, including cease-and-desist letters, client correspondence (such as emails explaining legal issues), fee agreements, and engagement letters. Certain tools can also be used to prepare initial drafts of motions, legal briefs, and settlement agreements.
- Legal Research. GenAI-powered research tools, such as Lexis+ AI and Westlaw’s AI-Assisted Research, can automate legal research, scan legal databases, and identify relevant case law, statutes, and regulations. Users can ask a question rather than using Boolean search methods. Many vendors have also developed tools to help counsel identify judge-specific information, including predictions of how judges might rule on particular cases based on their past decisions and behavior.
- Document Review / eDiscovery. GenAI can help streamline discovery by automating the review of documents to identify key information, highlight relevant passages, and even suggest potential legal arguments. The range of uses here is broad.
- Document Summarization. GenAI tools can distill critical information, enabling reviewers to understand important issues quickly, and may even be able to answer specific questions based on what they learn from documents exchanged in discovery.
- Data Exploration. Teams can explore data by category or topic, which facilitates identification of review boundaries, more efficient determination of responsiveness and issue codes, and identification of hot documents.
- Coding Suggestions. Drawing on previously coded documents, GenAI can suggest coding a new document as responsive, privileged, or sensitive and can suggest issue codes, saving time and boosting consistency.
- Reasons or Justification. By analyzing a document’s content and previously coded documents, GenAI can offer a rationale for coding a document as responsive, privileged, or sensitive.
Existing Technology Assisted Review (“TAR”) and Continuous Active Learning (“CAL”) technology, also called “predictive coding,” uses computer algorithms to predict which documents are relevant to a case. For example, by continuously learning from human coding decisions, this technology can prioritize the review of the documents most likely to be relevant to the matter. The GenAI uses described above can be combined with TAR or CAL to increase review efficiency and reduce cost (a minimal conceptual sketch of this prioritization approach appears after this list).
- Deposition Preparation. GenAI technologies can summarize and present the most important information from a large volume of documents to help identify exhibits, create summaries and timelines for use in depositions, and even generate a first draft of an outline.
- Case Valuations. By analyzing large volumes of historical case data together with case-specific factors, GenAI tools can assist with potential settlement strategies and help provide insight into possible outcomes.
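To make the predictive-coding concept referenced above concrete, the short sketch below is a simplified, hypothetical illustration only, not the workflow of any particular eDiscovery product. It assumes a generic Python environment with the scikit-learn library and shows the basic idea: a model trained on human coding decisions scores unreviewed documents so the likeliest-relevant ones are routed to reviewers first.

```python
# Hypothetical sketch of predictive-coding-style prioritization (illustration only):
# a classifier trained on human "responsive"/"not responsive" decisions scores
# unreviewed documents so the likeliest-relevant ones are reviewed first.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Invented sample data: text of documents already coded by human reviewers,
# with 1 = responsive and 0 = not responsive.
reviewed_docs = [
    "Email discussing the disputed supply agreement and delivery delays",
    "Quarterly newsletter announcing the company picnic",
    "Memo summarizing negotiations over the indemnification clause",
    "Automated notice about a password reset",
]
reviewed_labels = [1, 0, 1, 0]

# Unreviewed documents awaiting prioritization.
unreviewed_docs = [
    "Draft amendment to the supply agreement circulated for comment",
    "Reminder to submit parking validation requests",
]

# Convert text to TF-IDF features and fit a simple classifier on the human decisions.
vectorizer = TfidfVectorizer()
X_reviewed = vectorizer.fit_transform(reviewed_docs)
model = LogisticRegression().fit(X_reviewed, reviewed_labels)

# Score the unreviewed documents and sort them so the likeliest-responsive come first.
scores = model.predict_proba(vectorizer.transform(unreviewed_docs))[:, 1]
for score, doc in sorted(zip(scores, unreviewed_docs), reverse=True):
    print(f"{score:.2f}  {doc}")
```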
Limitations of GenAI Technology
The rapid advancement of AI technology is outpacing the development of clear regulations and legal standards to govern its use in the legal profession, leading to uncertainty and potential legal challenges. Risk-mitigation strategies to prepare for GenAI’s as-yet-unknown legal consequences should address:
- Privacy and Data Security. A leading concern about GenAI is data privacy: inputs may contain confidential information, and that information is subject to the risk of unauthorized access or data breaches.
- Reliability and Accuracy. Depending solely on GenAI for legal research or analysis could result in inaccurate or unreliable information, especially if the GenAI system lacks transparency or human oversight.
- Ethical Implications. The well-documented biases in GenAI algorithms, the need for transparency in GenAI-driven decision-making, and the difficulty in ensuring that GenAI systems adhere to legal and professional standards each pose unique challenges.
- Limited Contextual Understanding. GenAI systems may struggle either to understand the broader context of an area of law or to recognize subtle nuances and exceptions that human lawyers are trained to identify.
- AI Bias. If GenAI algorithms are trained on biased data, they may perpetuate existing inequalities and discrimination in legal outcomes, leading to unfair treatment of certain individuals or groups.
- Dependency on Technology. Overreliance on GenAI systems without maintaining a balance of human judgment and expertise may lead to a loss of critical legal reasoning and decision-making skills among legal professionals.
- Accountability and Liability. Determining accountability and liability for AI-generated legal decisions or errors can be complex, especially if the decision-making process of AI systems is not fully transparent.
Mitigating Risk
While private platforms increase the accuracy of GenAI’s responses and protect confidential information, thereby reducing many of the risks presented by publicly available platforms, counsel must remain cognizant of these limitations to ensure adherence to professional obligations. Key considerations for mitigating risk include:
- Comply with the Rules. Some key rules are not specific to GenAI. For example, under Rule 11 of the Federal Rules of Civil Procedure, counsel must certify that the claims, defenses, and other legal contentions presented to the court are warranted by existing law or by a nonfrivolous argument for extending, modifying, or reversing existing law or for establishing new law, a certification that no GenAI platform can make on counsel’s behalf. And because certain judges and courts require counsel to file certifications regarding their use of GenAI in a case, be sure to consult the court’s and/or judge’s local rules and standing orders when using GenAI.
- Trust But Verify. GenAI has famously generated made-up responses (known as “hallucinations”). A recent Stanford research study found that GenAI research tools made by LexisNexis (Lexis+ AI) and Thomson Reuters (Westlaw AI-Assisted Research and Ask Practical Law AI) each hallucinate between 17% and 33% of the time. See Varun Magesh et al., Hallucination-Free? Assessing the Reliability of Leading AI Legal Research Tools, Stanford Institute for Human-Centered Artificial Intelligence (2024). To comply with professional obligations, counsel cannot rely on the results obtained from GenAI but must independently confirm their accuracy.
- Protect Sensitive Information. Carefully guard against disclosure of sensitive or private information in public GenAI platforms.
- Understand Plagiarism Risks. The sources used by GenAI models are not always readily apparent to users. Strategies to avoid inadvertently plagiarizing an existing source include taking the time to incorporate counsel’s own style and analysis, verifying the accuracy of the AI-generated content, and including proper citations.
- Create Effective Research Prompts. When asking a given program to assist with research, counsel should be specific and include as many details as possible (e.g., jurisdiction, date range, and procedural history) to obtain the most accurate result. They should also specify the target audience and the desired outcome in the query. For example, a prompt asking for recent decisions from a specified jurisdiction on a defined legal issue, summarized for a client update, will typically return more useful results than a generic question.
- Help the GenAI Platform Learn. Private GenAI tools can be trained on examples provided by the law firm. Carefully selecting the work product used to train the tool will likely improve the quality of the drafts it generates.
- Discuss GenAI Use with Clients. Consider discussing the use of GenAI with clients and obtaining their informed consent to use, or not use, GenAI for certain tasks.
Conclusion
Private GenAI tools for use in the legal industry are rapidly advancing and can significantly improve the efficiency of many common litigation tasks. By investing the resources to acquire and learn to use private GenAI tools, understanding the limitations of the technology, and adhering to ethical principles, litigators can get the most out of these powerful tools. While GenAI certainly cannot replace litigators, it can help them become more effective attorneys for their clients.
Jeffrey W. Greene is a shareholder in the Litigation Practice in the Boston office of Greenberg Traurig LLP and is Co-Chair of the firm’s global eDiscovery & eRetention Practice.
Elizabeth E. Georgiopoulos is an associate with Greenberg Traurig with a focus on the areas of electronic discovery, privacy and data protection, and complex commercial litigation.