AI fabricates 21 out of 23 citations; lawyer sanctioned, reported to state bar [pdf]

25 points | posted 5 months ago
by 1vuio0pswjnm7

11 Comments

amilios

5 months ago

Plenty of progress in models that can use tools and search. Would love to see how one of these tool/search-enabled models does at this kind of task. In my experience, they don't fabricate things anymore; they just occasionally misrepresent the content of citations (putting a citation somewhere it doesn't actually support what is written).

delichon

5 months ago

A few days ago I asked GPT 5 for links to news on the Charlotte murder before the story got reported by the mainstream media. It gave me five different links, including AP and Reuters. Every one, five out of five, was a hallucination.

splix

5 months ago

I'm wondering how you asked for the links?

It's supposed to search for actual documents and then process them (extracting content, summarizing, giving you the links, and so on).

watwut

5 months ago

It hallucinated complete documentation for the tech we asked it about just two weeks ago. Completely made-up documentation with only a vague relationship to how it really works.

vinni2

5 months ago

I asked GPT-5 for an updated literature survey for a paper I was writing, with search enabled, and explicitly asked it to use Google Scholar, arXiv, etc. Yet most of the papers were nonexistent, and in some cases it even pointed to GitHub repos that were private.

addaon

5 months ago

Two of them were real? That's a state-of-the-art model compared to what I've seen…

sixtram

5 months ago

A PhD-level degree in fabrication.

1vuio0pswjnm7

5 months ago

"Appellants counsel has acknowledged that his briefs are replete with fabricated legal authority, which he admits resulted from his reliance on generative AI sources such as ChatGPT, Claude, Gemini, and Grok. Counsel says that he was not previously aware of the problem of AI hallucinations, but he has educated himself about the issue since receiving the OSC."