
Yes, I read "Attention Is All You Need", and I understand that the transformer architecture it describes (and the generative pre-trained models built on it) operates on "tokens" rather than language specifically. So in this case, I'm using "LLM" as shorthand for what OpenAI is doing with GPTs. I'll try to be more precise in the future.
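
For what it's worth, the "tokens, not language" point is easy to see directly: the model's actual input is a sequence of integer IDs, not words. A minimal sketch using OpenAI's tiktoken library (assuming it's installed; "cl100k_base" is just one encoding, picked for illustration):

    import tiktoken

    # Tokenizers map text to integer IDs; the model only ever sees the IDs.
    enc = tiktoken.get_encoding("cl100k_base")

    ids = enc.encode("Attention is all you need.")
    print(ids)  # a list of integers, not words
    # Decoding each ID separately shows tokens often aren't whole words.
    print([enc.decode([i]) for i in ids])

Nothing downstream cares whether those IDs came from English, source code, or anything else, which is why the paper frames everything in terms of tokens.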

That still leaves the disagreement between Altman and Sutskever over whether the current technology will lead to AGI or "superintelligence", with Altman clearly turning toward skepticism.



Fair enough; a shame "Large Tokenized Models" etc. never entered the nomenclature.


Some terms I've seen used for the technology:

Big-Data Statistical Models

Stochastic Parrots or parrot-tech

plausible sentence generators

glorified auto-complete

cleverbot

"a Blurry JPEG of the Web" <https://www.newyorker.com/tech/annals-of-technology/chatgpt-...>

and just plain ol' "machine learning"



