
There may be bugs, but not hallucinations. Bugs are at least reproducible, and the source code of a verification tool is much, much smaller than an LLM, so its finite number of bugs has a much higher chance of being found; with an LLM, it is probably impossible to remove all hallucinations.
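A minimal sketch of the distinction, using a toy parity checker as a stand-in for a verification tool and a randomized function as a stand-in for an LLM (both are hypothetical illustrations, not part of the original comment):

```python
import random

def check_parity(n):
    # Deterministic checker: the same input always yields the same answer,
    # so any bug in it is reproducible on demand and can be fixed for good.
    return n % 2 == 0

def sampled_answer(n, seed=None):
    # Toy stand-in for a stochastic LLM: the output can vary across runs,
    # so a "hallucination" observed once may not reproduce on the next run.
    rng = random.Random(seed)
    correct = n % 2 == 0
    return correct if rng.random() > 0.1 else not correct

# The checker behaves identically on every invocation:
assert all(check_parity(10) == check_parity(10) for _ in range(100))
```

The point is not the parity logic itself but the contrast: a deterministic tool's failures can be replayed and debugged, while a sampled output has no such guarantee.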

To turn your question around: What if the compiler that compiles your LLM implementation “hallucinates”? That would be the closer parallel.


