Somewhat confusingly, it appears the tokenizer vocabulary size and the context length are both 128k tokens!
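
If it helps, here is a quick sketch of how you could check the two numbers yourself with a Hugging Face-style tokenizer and config (the model id below is just a placeholder, not a real checkpoint):

    # Sketch only: the model id is a hypothetical placeholder.
    from transformers import AutoConfig, AutoTokenizer

    model_id = "some-org/some-128k-model"

    tok = AutoTokenizer.from_pretrained(model_id)
    cfg = AutoConfig.from_pretrained(model_id)

    # Vocabulary size: how many distinct token ids the tokenizer can emit.
    print("vocab size:", tok.vocab_size)

    # Context length: maximum sequence length (in tokens) the model accepts.
    print("context length:", getattr(cfg, "max_position_embeddings", None))

The two are independent quantities that just happen to be measured in the same unit, which is why the coincidence reads oddly.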



Yup, that's why I wanted to clarify things.



