kazinator
a day ago
> But ["hallucination"] can also refer to an AI-generated answer that is factually accurate, but not actually relevant to the question it was asked, or fails to follow instructions in some other way.
No, "hallucination" can't refer to that. That's a non sequitur or non-compliance and such.
Hallucination is quite specific, referring to making statements which can be interpreted as referring to the circumstances of a world which doesn't exist. Those statements are often relevant; the response would be useful if that world did coincide with the real one.
If your claim is that hallucinations are getting worse, you have to measure the incidences of just those kinds of outputs, treating other forms of irrelevance as a separate category.
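To make that concrete, here is a rough sketch (my own illustration, not anything from the article) of what keeping the categories separate looks like when tallying labeled outputs; the labels and sample data are entirely hypothetical:

    from collections import Counter

    # Hypothetical failure-mode labels; the point is only that
    # hallucinations are counted separately from other kinds of
    # irrelevance or non-compliance, not lumped together.
    LABELS = ("correct", "hallucination", "irrelevant", "non_compliant")

    def hallucination_rate(labeled_outputs):
        """Fraction of responses labeled as hallucinations, with the
        other failure categories tracked as their own buckets."""
        counts = Counter(label for _, label in labeled_outputs)
        total = sum(counts.values())
        return counts["hallucination"] / total if total else 0.0

    # Toy labeled sample, made up for illustration.
    sample = [
        ("cites a paper that does not exist", "hallucination"),
        ("answers a different question accurately", "irrelevant"),
        ("ignores the requested output format", "non_compliant"),
        ("correct and on-topic answer", "correct"),
    ]

    print(hallucination_rate(sample))  # 0.25 on this toy sample

Only a trend in the first bucket would support a claim that hallucinations specifically are getting worse; growth in the other buckets is a different problem.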
rsynnott
a day ago
I mean, 'hallucination' as applied to LLMs has _never_ referred to actual hallucinations. For better or for worse, it has become a blanket term for whatever old nonsense the stochastic parrot vomits forth.
(Personally I never liked the term; it's inappropriate anthropomorphism and will tend to mislead people about what's actually going on. 'Slop' is arguably a better term, but it is broader, in that it can refer to LLM output which is merely _bad_.)
kazinator
a day ago
No, of course not; it refers to generated speech which refers to things that are not there, reminiscent of someone reporting on their hallucinations.
When Macbeth speaks these lines:
Is this a dagger which I see before me, The handle toward my hand? Come, let me clutch thee.
the character is understood to be hallucinating. We infer that by applying a theory-of-mind-style hypothesis to the text.
It's wrong to apply a theory of mind to an LLM, but the glove seems to fit in the case of the hallucination concept; people have latched on to it. The LLMs themselves use the term and explicitly apologize for having hallucinated.