Legal Malpractice and ChatGPT

Sakkas Cahn & Weiss

It was only a matter of time before it happened.

The news media got very excited when GPT-4 passed the bar exam, offering a series of breathless reports on the matter.

AI professionals have warned that large language models (LLMs) like ChatGPT are prone to “hallucinations.” That is, sometimes, they just make stuff up.

Casetext’s Chief Innovation Officer Pablo Arredondo warned that large language model AIs are only tools, not “robot lawyers.”

The New York Bar Association issued a warning about potential risks to the legal profession, stating:

“Before using ChatGPT in practice, attorneys have a duty to provide competent representation to their clients. To maintain such competence, attorneys should stay up to date with the benefits and risks associated with relevant technology. Confidentiality, security, and privacy are potential risks associated with ChatGPT.”

Even ChatGPT will generally tell you to consult a lawyer if you ask it any legal questions.

Despite all these warnings, some people will look for any shortcut.

It was only a matter of time before some lawyer tried to use ChatGPT to do the bulk of the work for them, burning both themselves and their clients.

Worse, the lawyers in question were personal injury lawyers, going up against a major corporate defendant with a crack legal team and deep pockets.

On June 22, 2023, a US judge imposed sanctions on two lawyers who submitted a legal brief containing six fictitious case citations. ChatGPT generated the research behind the brief, including the six cases it cooked up with its little AI imagination. The lawyers stated they did not realize that ChatGPT could fabricate cases. One of them said he believed that ChatGPT functioned as a “super search engine.” Unfortunately, that is not how ChatGPT or other large language models work.

In reality, these programs work by predicting which words or text fragments are most likely to follow the text that came before, using a statistical model that has ingested billions of examples from all over the Internet. In other words, ChatGPT is very good at imitating a super search engine, but it isn’t one. Its training data stops in 2021, and it does not have live access to the Internet.
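To see why a language model is not a search engine, here is a deliberately tiny, hypothetical Python sketch of the underlying idea: counting which word tends to follow which in some example text, then generating new text one word at a time from those statistics. Real systems like ChatGPT use enormous neural networks rather than a simple count table, so treat this only as an illustration; every name and the sample text in it are made up for the example.

    import random
    from collections import defaultdict, Counter

    # Toy text standing in for the billions of documents a real model trains on.
    training_text = (
        "the court granted the motion the court denied the motion "
        "the plaintiff filed the motion the court granted the appeal"
    )

    # Count which word tends to follow which (a simple "bigram" table).
    follow_counts = defaultdict(Counter)
    words = training_text.split()
    for current_word, next_word in zip(words, words[1:]):
        follow_counts[current_word][next_word] += 1

    def predict_next(word):
        """Pick a statistically likely next word; no real documents are looked up."""
        candidates = follow_counts.get(word)
        if not candidates:
            return None
        choices, weights = zip(*candidates.items())
        return random.choices(choices, weights=weights)[0]

    # Generate text one word at a time. The output sounds plausible, but nothing
    # checks whether it is true, which is exactly how "hallucinations" arise.
    word = "the"
    output = [word]
    for _ in range(8):
        word = predict_next(word)
        if word is None:
            break
        output.append(word)
    print(" ".join(output))

The point of the sketch is the gap it exposes: the model chooses words that look right next to each other, not words that correspond to a case that actually exists.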

The judge who issued the sanctions said, “There is nothing inherently improper in lawyers using AI for assistance, but lawyer ethics rules impose a gatekeeping role on attorneys to ensure the accuracy of their filings.”

In this particular case, the lawyers continued to double down, insisting the cases were real rather than admitting to the court that they had made a mistake.

We recommend working with lawyers who are a little slower to embrace this emerging technology. There are already some excellent AI case research tools within LexisNexis, Westlaw, and Casetext: tools that help legal researchers see how many times a case has been cited, whether more recent decisions have contradicted its rulings, and whether it is out of date. Crucially, those tools stay anchored to real, existing cases that any lawyer can pull and read.

You do not want a lawyer who feels comfortable letting an LLM write your motions and legal briefs.

What if a lawyer does so behind your back? You might have a legal malpractice claim on any number of grounds if you suffered provable damages traceable to the attorney’s misconduct.

For example, you could make the case that the attorney neglected your matter by failing to exercise their full professional judgment on your behalf.

And because everything you type into ChatGPT or other LLMs is available to the companies running those systems, you could claim that asking ChatGPT for legal work is tantamount to discussing your case with others if that sharing leads to financial consequences for you. OpenAI staff can review what you say to ChatGPT. One assumes the same is true for Google and Bard, or Microsoft and Bing.

Confidentiality violations were part of the opinion issued by the NY Bar. And it’s not just OpenAI that might view your case information. Recently, a bug in ChatGPT allowed some users to view the chat history titles of other users, and OpenAI reported that roughly 1.2% of ChatGPT Plus subscribers may have had personal and payment details exposed to other users.

If your attorney comes out swinging backed by fake cases, they are making grievous civil litigation errors, failing in their due diligence, and committing lawyer negligence.

Hopefully, the fact that these two lawyers were sanctioned will keep other lawyers from repeating their mistakes. If one does, and you suffer as a result, our firm specializes in pursuing legal malpractice cases. We’ll be happy to review the facts of your specific case and to work with you if you have a reasonable claim.

See also:

Can I Hold a Bad Lawyer Accountable For Their Actions?

Did Your Lawyer Commit Legal Malpractice? Don’t Make it Worse!

Did You Hire a Bad Lawyer? Here’s What to Do About It
