News

A man sued the airline Avianca. His lawyer used ChatGPT.


The lawsuit started like so many others: A man named Roberto Mata sued Avianca Airlines, saying he was injured when a metal serving cart struck his knee during a flight to Kennedy International Airport in New York.

When Avianca asked a federal judge in Manhattan to toss out the case, Mr. Mata’s lawyers vehemently objected, submitting a 10-page brief that cited more than half a dozen relevant court decisions. There were Martinez v. Delta Air Lines, Zicherman v. Korean Air Lines and, of course, Varghese v. China Southern Airlines, with its learned discussion of federal law and “the tolling effect of the automatic stay on a statute of limitations.”

There was just one hitch: No one – not the airline’s lawyers, not even the judge himself – could find the decisions or the quotations cited and summarized in the brief.

That’s because ChatGPT invented everything.

The lawyer who created the brief, Steven A. Schwartz of the firm Levidow, Levidow & Oberman, threw himself on the mercy of the court on Thursday, saying in an affidavit that he had used an artificial intelligence program to do his legal research – “a source that has revealed itself to be unreliable.”

Mr. Schwartz, who has practiced law in New York for three decades, told Judge P. Kevin Castel that he had no intent to deceive the court or the airline. Mr. Schwartz said that he had never used ChatGPT before and was “therefore unaware of the possibility that its content could be false.”

He told Judge Castel that he even asked the program to verify that the cases were real.

It said yes.

Mr. Schwartz said he “greatly regrets” having relied on ChatGPT “and will never do so in the future without absolute verification of its authenticity.”

Judge Castel said in an order that he had been presented with “an unprecedented circumstance,” a legal submission replete with “bogus judicial decisions, with bogus quotes and bogus internal citations.” He ordered a hearing for June 8 to discuss potential sanctions.

As artificial intelligence sweeps the online world, it has conjured dystopian visions of computers replacing not only human interaction but also human labor. The fear is particularly intense for knowledge workers, many of whom worry that their daily activities may not be as rarefied as the world thinks – but for which the world nevertheless pays billable hours.

Stephen Gillers, a professor of legal ethics at New York University School of Law, said the issue was particularly acute among lawyers, who have been debating the value and the dangers of A.I. software like ChatGPT, as well as the need to verify whatever information it provides.

“The discussion now among the bar is how to avoid exactly what this case describes,” Mr. Gillers said. “You cannot just take the output and cut and paste it into your court filings.”

The real case of Roberto Mata v. Avianca Inc. suggests that white-collar workers may have at least a little time left before the robots take over.

The case began when Mr. Mata was a passenger on Avianca Flight 670 from El Salvador to New York on Aug. 27, 2019, and an airline employee struck him with the serving cart, according to the lawsuit. After Mr. Mata sued, the airline filed a motion to dismiss the case because the statute of limitations had expired.

In a brief filed in March, Mr. Mata’s lawyers said the lawsuit should proceed, bolstering their argument with references to and quotes from the many court decisions that have since turned out not to exist.

Soon after, Avianca’s attorneys wrote to Judge Castel, saying they could not find the cases cited in the summary.

As for Varghese v. China Southern Airlines, they said they had been “unable to locate this case by caption or citation, nor any case bearing any resemblance to it.”

They pointed to a lengthy quotation from the purported Varghese decision contained in the brief. “The undersigned has not been able to locate this quotation, nor anything like it, in any case,” Avianca’s lawyers wrote.

Indeed, the lawyers added, the quotation, purportedly from the Varghese decision, cited something called Zicherman v. Korean Air Lines Co. They could not find that, either.

Judge Castel ordered Mr. Mata’s lawyers to provide copies of the opinions referred to in their brief. The lawyers submitted a compendium of eight; in most cases, it listed the court and the judges who issued them, the docket numbers and the dates.

The copy of the purported Varghese decision, for example, is six pages long and says it was written by a member of a three-judge panel of the 11th Circuit. But Avianca’s lawyers told the judge that they could not find that opinion, or the others, on court dockets or in legal databases.

Bart Banino, a lawyer for Avianca, said that his firm, Condon & Forsyth, specializes in aviation law and that its lawyers could tell that the cases in the brief were not real. He added that they had an inkling a chatbot might have been involved.

Mr. Schwartz did not respond to a message seeking comment, nor did Peter LoDuca, another lawyer at the firm, whose name appeared on the brief.

Mr. LoDuca said in an affidavit this week that he did not conduct any of the research in question and that he had “no reason to doubt the sincerity” of Mr. Schwartz’s work or the authenticity of the opinions.

ChatGPT generates realistic responses by guessing which fragments of text should follow other sequences, based on a statistical model that has ingested billions of examples of text pulled from all over the internet. In Mr. Mata’s case, the program appears to have discerned the labyrinthine framework of a written legal argument, but populated it with names and facts from a stew of existing cases.

Judge Castel, in his order convening the hearing, suggested that he had made his own inquiry. He wrote that the clerk of the 11th Circuit had confirmed that the docket number printed on the purported Varghese opinion was connected to an entirely different case.

Calling the opinion “bogus,” Judge Castel noted that it contained internal citations and quotes that, in turn, were nonexistent. He said that five of the other decisions submitted by Mr. Mata’s lawyers also appeared to be fake.

On Thursday, Mr. Mata’s lawyers released affidavits containing their version of what happened.

Mr. Schwartz wrote that he had originally filed Mr. Mata’s lawsuit in state court, but that after the airline had it transferred to Manhattan’s federal court, where Mr. Schwartz is not admitted to practice, one of his colleagues at the firm, Mr. LoDuca, became the attorney of record. Mr. Schwartz said that he had continued to do the legal research, in which Mr. LoDuca had no role.

Mr. Schwartz said that he had consulted ChatGPT “to supplement” his own work and that, “in consultation” with it, he found and cited the half-dozen nonexistent cases. He said ChatGPT had provided reassurances.

“Is Varghese a real case,” he typed, according to a copy of the exchange he filed with the judge.

“Yes,” the chatbot replied, offering a citation and adding that it “is a real case.”

Mr. Schwartz dug deeper.

“What are your sources,” he wrote, according to the filing.

“I apologize for the confusion earlier,” ChatGPT responded, offering a legal citation.

Mr. Schwartz asked: “Are the other cases you provided fake.”

ChatGPT replied: “No, the other cases I provided are real and can be found in reputable legal databases.”

But, alas, they cannot.

Sheelagh McNeil contributed research.
