How will AI revolutionise the practice of law?
Is AI a revolutionary way for legal practice to become more innovative and efficient? Or is it a misguided cheat code that will cause more problems than it solves? After all, clients don’t necessarily want a chatbot solving their legal problem. They want a trained legal mind.
As with most things, the answer probably lies in a hybrid model. AI can do some of the repetitive, time-consuming work, which frees up lawyers for the technical aspects. But there are limits to AI’s capabilities.
Bespoke legal tech
One of the emerging technologies that has taken the legal world by storm is Harvey AI. It's backed by OpenAI, the force behind ChatGPT, and is built on similar technology.
It was first adopted by A&O Shearman (then Allen & Overy) in February 2023, which gave it instant credibility in the industry. Harvey uses natural language processing, machine learning, and data analytics to automate tasks like contract analysis, due diligence, and regulatory compliance. It works in different languages and can answer questions quickly.
The reason Harvey has been particularly successful is that it's trained on legal data such as case law. It can also be trained on a law firm's own precedents and templates, much like the onboarding process for a new employee.
E-disclosure
Many law firms are using AI to speed up time-intensive tasks like disclosure and document review. Large language models can be trained on seed sets of data to identify the most relevant material first, so that reviewers can find key documents more quickly. That makes these tasks more efficient and cost-effective for the client, which is why this is one of the most popular ways lawyers are using AI.
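As a very simplified illustration of the idea (not how any particular e-disclosure product works), the sketch below trains a classical text classifier on a hypothetical, human-labelled seed set and then ranks unreviewed documents by predicted relevance, so the likeliest hits are reviewed first. Real tools use far more capable models, but the workflow is broadly similar.

```python
# Illustrative sketch only: a simplified relevance-ranking workflow for
# document review, using a classical TF-IDF classifier rather than a
# large language model. The documents and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Seed set: documents a human reviewer has already marked relevant (1) or not (0)
seed_docs = [
    "Email discussing the disputed share purchase agreement",
    "Board minutes approving the indemnity clause",
    "Canteen menu for the week of 3 June",
    "Holiday rota for the facilities team",
]
seed_labels = [1, 1, 0, 0]

# Unreviewed documents to prioritise
unreviewed = [
    "Draft side letter varying the indemnity clause",
    "Invitation to the summer staff party",
]

vectoriser = TfidfVectorizer()
model = LogisticRegression().fit(vectoriser.fit_transform(seed_docs), seed_labels)

# Score the unreviewed documents and surface the most likely relevant first
scores = model.predict_proba(vectoriser.transform(unreviewed))[:, 1]
for doc, score in sorted(zip(unreviewed, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {doc}")
```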
First drafts
Can AI prepare a first draft for you? Perhaps it can draft letters, advice notes, or even pleadings.
The problem with this is that the output of AI will sound good, confident, and assured. But if the reviewer hasn’t sufficiently turned their mind to the problem (as you do if you’re writing from scratch), then it’s likely to lack nuance.
It's probably best to do the first draft yourself, and then ask AI to have a go afterwards. That way, you may pick up on points you overlooked the first time, and it's also easier to spot places where the AI output may be wrong. Used like this, it is more of a sense-checking tool than a reliable draftsman.
However, there are tools out there that can prepare a first draft of, say, a lease report: the tool scans the document and pulls out the key information. AI is useful for this sort of 'extraction' task, but it cannot be relied upon for reasoning.
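To make the distinction concrete, here is a deliberately toy sketch of the 'extraction' idea (the lease wording and patterns are invented for illustration): pulling stated facts such as the term, rent, and break clause out of lease text. Commercial tools use far more sophisticated models, but the point stands: this is extraction of what the document says, not reasoning about what it means.

```python
# Toy illustration only: pulling key fields out of lease text with simple
# pattern matching. The clauses and patterns below are hypothetical.
import re

lease_text = """
The Term shall be 10 years from 1 January 2025.
The initial Annual Rent shall be £45,000 exclusive of VAT.
The Tenant may terminate this Lease on the fifth anniversary of the Term.
"""

patterns = {
    "term": r"Term shall be (\d+ years[^.]*)",
    "annual_rent": r"Annual Rent shall be (£[\d,]+)",
    "break_clause": r"(terminate this Lease[^.]*)",
}

# Build a simple 'lease report' of extracted facts
report = {}
for field, pattern in patterns.items():
    match = re.search(pattern, lease_text)
    report[field] = match.group(1) if match else "not found"

for field, value in report.items():
    print(f"{field}: {value}")
```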
Research
Will we see teams of lawyers replaced by ChatGPT for research?
Again, this seems sensible at first glance. ChatGPT has been trained on vast amounts of internet content and can produce an answer in seconds.
However, it does not necessarily grasp the nuance of case law. It may be able to present a researcher with relevant case law, but it won't necessarily grasp the meaning of that case law or its application to the facts at hand.
Pitfalls of using AI
AI can be a helpful tool, but it is by no means a panacea.
Using ChatGPT carries the possibility of costly data breaches for any law firm. The technology is continually trained on the input it is given, which includes input from users. If employees of the firm input confidential, sensitive, or privileged information, there is little control over how ChatGPT may use it. It could surface that confidential data in an answer to another user's query, jeopardising the privacy of that information.
Other embarrassing examples include lawyers, or litigants in person, who have relied on ChatGPT to draft pleadings. ChatGPT has 'hallucinated' case law that doesn't exist. These made-up cases have found their way into pleadings, only to be challenged and exposed as inauthentic by the judge. Not only is this embarrassing for the lawyers, it also misleads the court, which could be a breach of their professional obligations. As officers of the court, lawyers could face disciplinary proceedings, or even lose their right to practise law altogether, if they are found to have misled the court.
As AI improves and becomes more integral to working life, lawyers need a belt-and-braces approach to minimise the risk of data breaches, incorrect outputs, and substandard work. But used correctly, with sufficient oversight from trained legal minds to verify its output, AI can speed up time-consuming tasks.