Sevii
3 days ago
No, LLMs are not 'search'. Search, as in Google or a database query, is deterministic: are there results for query X? If there are, we return them. If there aren't, we return nothing.
LLMs do not work that way. An LLM has no conception of facts. Every query you make to an LLM produces an output, and the quality of that output depends on the training data. When the output is high-probability, you might think the LLM is returning the correct 'facts'. When it is low-probability, you might think the LLM is hallucinating.
LLMs are not search; they are a fundamentally different thing. Most code is 100% deterministic: the program executes its instructions exactly in order. LLM inference, which samples from a probability distribution, is not 100% deterministic.
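To make the distinction concrete, here's a toy sketch in Python (made-up index, vocabulary, and probabilities, nothing like a real search engine or a real transformer): the lookup can come back empty, while the sampler always emits something.

    import random

    # Toy "search": a deterministic lookup. Either the key is in the index or it isn't.
    index = {"capital of France": "Paris"}

    def search(query):
        return index.get(query)  # None when there is genuinely no result

    # Toy "LLM": always samples *something* from a probability distribution,
    # regardless of whether any stored fact backs it up.
    # (Made-up vocabulary and probabilities, purely for illustration.)
    next_token_probs = {"Paris": 0.90, "Lyon": 0.07, "Berlin": 0.03}

    def generate(prompt):
        tokens = list(next_token_probs)
        weights = list(next_token_probs.values())
        return random.choices(tokens, weights=weights)[0]

    print(search("capital of France"))      # Paris
    print(search("capital of Atlantis"))    # None -- search can say "no result"
    print(generate("capital of Atlantis"))  # always emits a token, plausible or not

The lookup either has an answer or it doesn't; the sampler has no way to decline.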
andrei_says_
2 days ago
One way to think of LLM output is that it is all hallucination. Sometimes it happens to coincide with reality.
To an LLM it’s all the same: there is no relationship to reality, only to likelihood.
It’s the difference between “this is something that Peter might say” and “this is something that Peter said”. To LLMs there’s no distinction.
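As a toy illustration of that "might say" vs. "said" point, here's a tiny bigram model in Python (made-up training sentences, obviously nothing like a real LLM, but the same principle of assigning likelihood to text with no notion of ground truth): a sentence Peter never said can score exactly as likely as one he did, and the model has no way to tell them apart.

    from collections import Counter, defaultdict

    # Toy bigram "model" of Peter, trained on made-up sentences.
    said = [
        "peter likes strong coffee",
        "peter likes quiet mornings",
        "peter hates strong opinions",
    ]

    bigrams = defaultdict(Counter)
    for sentence in said:
        words = sentence.split()
        for a, b in zip(words, words[1:]):
            bigrams[a][b] += 1

    def likelihood(sentence):
        # Product of conditional bigram probabilities P(next word | current word).
        words = sentence.split()
        p = 1.0
        for a, b in zip(words, words[1:]):
            total = sum(bigrams[a].values())
            p *= (bigrams[a][b] / total) if total else 0.0
        return p

    print(likelihood("peter likes strong coffee"))    # actually said: ~0.167
    print(likelihood("peter likes strong opinions"))  # never said: same ~0.167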