Is the LLM response wrong, or have you just failed to iterate it?

2 points | posted 5 months ago
by speckx

2 Comments

JohnFen

5 months ago

"You're holding it wrong!"

The problem, of course, is that people will, and do, take what these systems say as truth at first blush. They won't iterate. If they were inclined to do that, they'd probably engage in actual research in the first place rather than taking the easy path of asking an LLM. The whole promise of using an LLM for these sorts of tasks is minimizing effort.

pavel_lishin

5 months ago

Am I hallucinating?

No, it is the users who are out of touch.