
The prompt is unique but the tokens aren't.

Type "owejdpowejdojweodmwepiodnoiwendoinw welidn owindoiwendo nwoeidnweoind oiwnedoin" into ChatGPT and the response is "The text you sent appears to be random or corrupted and doesn't form a clear question," because the prompt doesn't correlate to training data.




> The prompt is unique but the tokens aren't.

The tokens aren't unique, but the sequence is. Every input this model sees is unique. Even tokens are not as simple as they seem.

If you type "ejst os th xspitsl of fermaby?" in ChatGPT it responds with

> It looks like you typed “ejst os th xspitsl of fermaby?”, which seems like a garbled version of:

> "What is the capital of Germany?"

> The capital of Germany is Berlin.

> If you meant to ask something else, feel free to clarify!

edit: formatting


The prompt does correlate to its training data. In this case, since you sent random text, it generated the most likely response to random text.

Or because the text you sent was random and doesn't form a clear question?

...? what is the response supposed to be here?


