
A lawyer used ChatGPT to prepare a court filing and was misled into citing cases that did not exist

2025-04-05 Update From: SLTechnology News&Howtos

Shulou(Shulou.com)11/24 Report--

CTOnews.com, May 28 (Xinhua) -- A lawyer recently submitted false information to a court after relying on ChatGPT, an AI chatbot, for legal research, the New York Times reported. The incident highlights the risk of misinformation when artificial intelligence is used in the legal field.

The case involved a man suing an airline over a personal injury. The plaintiff's legal team submitted a brief citing several prior court cases to support their arguments and establish precedent for their claims. However, the airline's lawyers found that some of the cited cases did not exist and immediately notified the presiding judge.

Judge Kevin Castel expressed surprise at the situation, calling it "unprecedented," and ordered the plaintiff's legal team to provide an explanation.

Steven Schwartz, one of the plaintiff's lawyers, admitted that he had used ChatGPT to search for similar legal precedents. In a written statement, Schwartz expressed deep regret, saying he had never before used artificial intelligence for legal research and had been unaware that its output could be false.

The documents submitted to the court included screenshots of a dialogue between Schwartz and ChatGPT, in which Schwartz asked whether a specific case, Varghese v. China Southern Airlines Co Ltd, was real. ChatGPT replied that it was, and that the case could be found in legal reference databases such as LexisNexis and Westlaw. Follow-up investigation revealed that the case did not exist, and further inquiry found that ChatGPT had fabricated six nonexistent cases in total.

As a result of the incident, the two lawyers involved, Peter LoDuca and Steven Schwartz of the firm Levidow, Levidow & Oberman, will attend a disciplinary hearing on June 8 to explain their actions. CTOnews.com noted that the incident has triggered discussion in the legal community about the appropriate use of artificial intelligence tools in legal research and the need for comprehensive guidelines to prevent similar situations.


© 2024 shulou.com SLNews company. All rights reserved.
