Show HN: Turn Any ArXiv Paper into a 200-Page Prerequisite Reading Book

6 points, posted 10 months ago
by melvinmelih

3 Comments

HenryBemis

10 months ago

I was wondering (also in reply to billconan's comment): does anyone check that the LLM doesn't hallucinate? I mean, if someone reads a study just for the fun of it, go crazy! But if someone finds a study about diabetes and decides to follow the 'diet' that the book suggests, it can go south very fast.

melvinmelih

10 months ago

This is a valid concern. While there are no guarantees that the AI won't hallucinate (hence the disclaimer in the book, especially for medical topics), I try to minimize it by pairing the writing with real-time research from Perplexity, so at least it is (or should be) based on verifiable information.
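
For readers curious what "pairing the writing with real-time research" can look like in practice, here is a minimal sketch of one possible approach: fetch sourced research notes from a search-grounded model first, then constrain the writing model to that material. The endpoint, model names, and prompts below are assumptions for illustration, not the project's actual pipeline.

    # Minimal sketch: ground the writing step in fresh research results.
    # Endpoint and model names are assumptions, not the project's real setup.
    from openai import OpenAI

    # Perplexity exposes an OpenAI-compatible API (assumed endpoint/model).
    research = OpenAI(api_key="PPLX_API_KEY", base_url="https://api.perplexity.ai")
    writer = OpenAI(api_key="OPENAI_API_KEY")

    def grounded_section(topic: str) -> str:
        # Step 1: gather up-to-date, citable material on the topic.
        notes = research.chat.completions.create(
            model="sonar",  # assumed model name
            messages=[{"role": "user",
                       "content": f"Summarize established findings on {topic}, with sources."}],
        ).choices[0].message.content

        # Step 2: restrict the writer to the researched notes, which reduces
        # (but does not eliminate) hallucination.
        draft = writer.chat.completions.create(
            model="gpt-4o",  # assumed model name
            messages=[
                {"role": "system",
                 "content": "Write a textbook-style section using only the research notes "
                            "provided. If the notes do not cover something, say so "
                            "instead of guessing."},
                {"role": "user", "content": f"Research notes:\n{notes}\n\nTopic: {topic}"},
            ],
        ).choices[0].message.content
        return draft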

billconan

10 months ago

This will be very useful for me if the quality is good.