Op-Ed: Should You Depend On AI To Navigate Legal Issues?

Editorial Note: Opinions and thoughts are the author’s own and not those of AFROTECH™.
It’s easy to assume that because you can use AI for something, you should use it for everything. People are already using it to create websites, images, and videos, and those uses do not carry a lot of penalties if something goes wrong (depending on what you are building). However, one area in which people are increasingly using AI, and should be cautious, is the legal system: navigating legal issues, creating legal documents, or doing legal research.
Ever since the release of ChatGPT over two years ago, we have seen people rush to bring AI to the world of professional services. Founders and investors alike have increasingly seen opportunity in leveraging AI to help in-house attorneys become more effective, and the everyday citizen has been looking to use AI instead of paying the cost of hiring an attorney.
Legal tech companies have made it their mission to reinvent legal professions and workflows from the ground up, using AI to automate mundane tasks so attorneys can focus on higher-value work. That economic potential has driven growing investment in AI-focused legal tech companies in recent years: Crunchbase reported that 79% of legal startup investment since 2024, around $2.2 billion, has gone to companies with a focus on AI.
Given how often venture capital firms and company owners work with legal counsel, it makes sense that venture dollars are flowing into a profession investors know well. And with that much investment, there are a lot of players looking to create more AI-enabled attorneys, though a few companies have risen to the top.
Harvey was founded in 2022 by Winston Weinberg, a former antitrust and securities litigator, and Gabriel Pereyra, who previously worked in AI at Google, DeepMind, and Meta. While most of the venture capital attention goes toward automation, Harvey’s founders decided to focus on augmentation, building an assistant to attorneys rather than a replacement for them. Harvey helps firms with everything from contract analysis to litigation assistance and regulatory compliance, and the company primarily focuses on Fortune 500 companies and Big Law firms. As reported by TechCrunch, it recently raised $300 million in a Series E funding round at a $5 billion valuation. While Harvey has claimed headlines here in the U.S., a firm based in the United Kingdom is grabbing headlines across the pond.
The Solicitors Regulation Authority, an independent body that upholds professional and ethical standards in the United Kingdom’s legal profession, reported that it has authorized Garfield.Law Ltd as the first purely AI-based firm to operate in England and Wales. Garfield’s focus is enabling businesses, from small and medium-sized companies to large enterprises, to recover unpaid debts through the small claims process in England and Wales. This is a case where AI works well: small claims follow specific, mundane steps, and that kind of process is ripe for automation. What we are seeing is that AI has a place in the legal profession, both in augmenting workers and in automating workflows. However, companies aren’t the only ones looking to leverage AI. People are trying to use it to navigate legal scenarios in their daily lives as well.
Fox 5 Atlanta reported that Georgia’s Court of Appeals fined an Atlanta-based attorney for citing cases that did not exist in a motion for a client’s divorce case; the court also found references in the filing that did not exist. The attorney was fined $2,500, and the case was sent back to the lower court to be reconsidered. A real risk of using AI in legal matters is that it has a tendency to make things up, which AI researchers call “hallucinations.” These hallucinations can put people in hot water if they are used to justify positions or make decisions with legal implications. Situations like these are why I would not recommend that people use AI to navigate legal situations unless they are legally trained themselves. I understand the incentive to save money by turning to AI instead of an attorney, but that can cause more costly issues down the road if people are not careful.
I believe AI will continue to be leveraged in legal offices and law firms around the world to give attorneys their time back in ways that benefit their clients and their caseloads. But I am wary of the average person believing they can represent themselves because ChatGPT told them they could. Often, people assume that what is most cost-effective is the most effective, and that is not always the case.