10 points, posted a month ago
by MohskiBroskiAI

Item id: 46491838

13 Comments

verdverm

a month ago

> zero-hallucination guarantees

This is impossible; either your testing is wrong or incomplete, or you are the one hallucinating.

MohskiBroskiAI

a month ago

[flagged]

verdverm

a month ago

Did the AI tell you this is all legit?

I'm not going to waste time verifying some random on the internet's idea that they solved P=NP or hallucinations in LLMs.

If you had, you'd be able to get the results published in a peer-reviewed forum.

Start there instead of "I'm right, prove me wrong"

Have you built the thing to know it actually works, or is this all theory without practice?

Show us you are right with implementation and evaluation

verdverm

a month ago

> Don't be a troll. Prove me wrong. Run the code.

There is no code in the repo you linked to. What code am I supposed to run?

This just looks like stateful agents and context engineering. Explain how it is different
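For readers unfamiliar with what "stateful agents and context engineering" usually refers to, here is a minimal illustrative sketch of that generic pattern: an agent that carries state across turns and injects it into the model's prompt. The names (`StatefulAgent`, `call_llm`) are hypothetical stand-ins and are not taken from the linked repo.

```python
from dataclasses import dataclass, field


@dataclass
class StatefulAgent:
    """Toy 'stateful agent + context engineering' loop (names are hypothetical)."""
    memory: list[str] = field(default_factory=list)  # facts persisted across turns

    def build_context(self, user_msg: str) -> str:
        # Context engineering: fold the stored state into the prompt text.
        facts = "\n".join(f"- {m}" for m in self.memory) or "- (none)"
        return f"Known facts:\n{facts}\n\nUser: {user_msg}\nAssistant:"

    def step(self, user_msg: str, call_llm) -> str:
        prompt = self.build_context(user_msg)
        reply = call_llm(prompt)  # call_llm is a stand-in for any model API
        self.memory.append(f"user said: {user_msg}")  # update state for the next turn
        return reply


# Usage with a dummy model, just to show the loop runs end to end.
agent = StatefulAgent()
echo_model = lambda prompt: f"(model reply to a {len(prompt)}-char prompt)"
print(agent.step("What did I tell you last time?", echo_model))
```

Nothing in this pattern constrains what the model generates, which is why state plus prompt construction alone does not amount to a hallucination guarantee.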