stavros
a day ago
How is it still news that LLMs hallucinate?
pwdisswordfishz
a day ago
Especially on shrooms
a day ago
More importantly, who put it in the group?
a day ago
> A moderator for the group said that the bot was automatically added by Meta and that “we are most certainly removing it from here.” Meta did not immediately respond to a request for comment.
a day ago
Whatever; AI encourages eating dangerous mushrooms, and there's also the book it wrote that poisoned people: https://news.ycombinator.com/item?id=41269514
Logically, it's doing this in every conversation.
Millions of incorrect facts are being pushed to people, except now it has billions of disconnected person-hours of data to get wrong.
It's not stuff like "gypsum is good for clay soil" or "weed tea helps plants grow". Those are old wives' tales and religion-like culture: still structured, and as a gardener you can process them.
With AI there is no culture; it'll tell you something wrong that no one else will ever hear. It has no meaning. You're not in the "plant when there's a full moon" club. It'll be a "s'mores Pop-Tarts help tomatoes grow" club of one.
It also gives you the balls to just go do stuff, so tomato, tomahto. It's no doubt convinced a few people to get cancer checks. But less nihilism would be nice.