A US lawyer embarrassed himself in court by citing past judgments that do not exist. The self-styled "air passenger rights expert" had simply left the preparation of his case to ChatGPT.
The repercussions were severe for him and his clients: the decisions he cited in his submissions simply did not exist. ChatGPT had, for example, conjured up verdicts against Iran Air and Delta that were never issued; no such proceedings had ever taken place.
For the lawyer, the matter was not merely embarrassing; it also landed him in legal trouble. He had to declare under oath to the court in New York that he had not deliberately sought to deceive it, but had negligently relied on ChatGPT. He acknowledged that failing to verify the AI's work before submitting the brief was a serious error.
The post USA: 'Air Passenger Rights Attorney' Accuses Himself With ChatGPT Brief appeared first on Aviation.Direct.
