- A lawyer used ChatGPT to draft an affidavit for a personal injury lawsuit against an airline.
- The airline's lawyers and the presiding judge could not locate several of the court decisions it cited.
- That's because the cases were entirely fabricated.
ChatGPT's soaring popularity in recent months has drawn both optimism and skepticism toward the generative AI program. Now the tool is at the center of a disciplinary case involving a New York lawyer. Steven Schwartz, a personal injury attorney at Levidow, Levidow & Oberman, is scheduled for a sanctions hearing on June 8 after it came to light that he used ChatGPT to draft an affidavit. Another lawyer at the firm, Peter LoDuca, also faces sanctions but stated in a court filing that he did not conduct any research for the affidavit.
The affidavit in question was prepared using ChatGPT for a lawsuit brought by a man who claimed he was injured by a serving cart on an Avianca flight. It cited several court decisions that turned out to be fabricated. In an order, Judge Kevin Castel called the situation an "unprecedented circumstance," noting that six of the cited cases appeared to be fictitious, complete with fabricated quotes and internal citations. Neither the airline's lawyers nor Judge Castel could locate the cases.
Bart Banino, an attorney representing Avianca from Condon & Forsyth, told The New York Times that his firm could tell the cases were fake and suspected from the start that a chatbot might have been involved.
Schwartz apologized to Judge Castel, saying he had never used the AI tool before and did not know its output could be false, The Times reported. He added that ChatGPT had "revealed itself to be unreliable" as a source. At the time of publication, Avianca, LoDuca, and Schwartz had not responded to Insider's requests for comment.