The term "artificial intelligence" (AI) was coined in the 1950s by the American computer scientist John McCarthy. Today, AI is among the fastest-growing and most consequential technologies in development. The Bank of America forecasts that by 2025 robots will perform about 45 % of industrial tasks (today they are responsible for approximately 10 %). Some, including Elon Musk, say that AI may soon replace lawyers, too. But is that so?
A litigator's new AI assistant
AI can assist lawyers with document review, proofreading, research and more. Let's look at the top time-saving AI tools used in a litigator's practice: CARA analyses legal documents and returns a list of "suggested cases" that are relevant; ROSS identifies legal issues in the areas of bankruptcy, intellectual property and labour and employment law; ZERO helps recover lost billable time, saves hours spent processing e-mails and reduces errors in e-mail filing; and Docubot by 1LAW transfers information received from a client into a legal document.
AI, can you do the research please?
For litigators, AI tools can be useful in the initial "investigation" stage, where technology helps sift through huge volumes of documents and data. For example, when a lawyer has limited knowledge of a company's past and wants to conduct due diligence, they upload all the relevant documents and data (e.g. company names, employee lists, e-mails, contracts, memos) and apply keywords and date ranges. The software analyses the results to identify themes and commonalities, grouping related documents together in a coherent and helpful way, so that the links between them are immediately visible. AI can automate such due diligence, saving hours of monotonous, repetitive work. The lawyer's review can then start.
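The sifting step described above can be sketched in a few lines of code. This is a minimal illustration only: the documents, field names and keywords below are invented, and real e-discovery tools use far more sophisticated analysis, but the core idea of filtering by date range and grouping by shared theme looks roughly like this.

```python
from datetime import date

# Hypothetical document set; in practice this would be the uploaded corpus
# of e-mails, contracts and memos with extracted text and metadata.
documents = [
    {"name": "merger_memo.txt", "date": date(2019, 3, 1), "text": "merger terms with Acme Corp"},
    {"name": "hr_email.txt", "date": date(2018, 7, 12), "text": "employee dispute over contract"},
    {"name": "old_note.txt", "date": date(2010, 1, 5), "text": "merger draft, outdated"},
]

def sift(docs, keywords, start, end):
    """Keep documents inside the date range and group them by matched keyword."""
    groups = {}
    for doc in docs:
        if not (start <= doc["date"] <= end):
            continue  # outside the date range the lawyer applied
        for kw in keywords:
            if kw in doc["text"].lower():
                groups.setdefault(kw, []).append(doc["name"])
    return groups

groups = sift(documents, ["merger", "contract"], date(2015, 1, 1), date(2020, 12, 31))
print(groups)  # {'merger': ['merger_memo.txt'], 'contract': ['hr_email.txt']}
```

The grouping makes the links between documents visible at a glance, which is exactly the output the lawyer's review then starts from.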
Lawyers can also use data points from past case law, win/loss rates and a judge's ruling history to identify trends and patterns.
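As a toy illustration of that kind of analysis, the sketch below computes win rates per judge from past outcomes. The case records and field names are invented for the example; real litigation-analytics products draw on large court-record databases.

```python
from collections import defaultdict

# Invented sample of past case outcomes, keyed by presiding judge.
past_cases = [
    {"judge": "Judge A", "outcome": "win"},
    {"judge": "Judge A", "outcome": "loss"},
    {"judge": "Judge A", "outcome": "win"},
    {"judge": "Judge B", "outcome": "loss"},
]

def win_rates(cases):
    """Return each judge's win rate as wins divided by total cases."""
    tally = defaultdict(lambda: [0, 0])  # judge -> [wins, total]
    for c in cases:
        tally[c["judge"]][1] += 1
        if c["outcome"] == "win":
            tally[c["judge"]][0] += 1
    return {judge: wins / total for judge, (wins, total) in tally.items()}

print(win_rates(past_cases))  # Judge A: ~0.67, Judge B: 0.0
```

Aggregates like these are what the "trends and patterns" amount to in practice: simple statistics over historical outcomes that inform, but do not decide, strategy.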
AI can't read between the lines
Whilst some basic procedures can be automated, human input is still needed throughout. AI cannot map every potential fact that might arise: the legislation or the rules might be straightforward, but the full picture may be far more complex. Lawyers are persuasive and intuitive; they negotiate, advocate and pick up on body language. These are skills that software will always find difficult to replicate, if it is capable of doing so at all.
The answer is: It depends
Whatever the sector or industry, it is no longer a matter of if AI will replace certain types of jobs, but to what extent.
Following Elon Musk's view, when we think about legal practice, we should consider the complexity of contracts and understand what it takes to create a perfect deal structure: "It's a lot of attorneys reading through a lot of information – hundreds or thousands of pages of data and documents. It's really easy to miss things." AI can comb through that material and deliver the best possible contract for the desired outcome, he concludes, adding that this would replace a lot of corporate attorneys. Still, even in the future, litigation will require human input: reading between the lines, interpreting body language and much more than merely processing data fast.
And software will need to be constantly tested, questioned and further developed – just like every litigator's practice: the risks of artificial intelligence are significant. Vast amounts of data are available for anybody to mine, manipulate and manage. As this datasphere grows, the risk of exposing customer or employee data, for example, will increase, and such data will become harder to protect. AI systems also learn from the dataset on which they were trained; depending on how that dataset was compiled, it may reflect biased assumptions that then influence the system's decision-making.
To answer the question "How will AI affect litigators?": AI will affect litigators when it comes to repetitive procedures, and it already helps save time. But it also raises many new questions. Most importantly: who will be legally responsible for an AI's output, especially if it fails?