
Before ChatGPT was in beta, there were already models that fit into 2 GB and smaller. They were complete shit, but they did exist.



I know, but what's changing is that they aren't shit now. They're not on par with GPT, but they're getting much closer, especially with a little massaging like Stanford has done.
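As a back-of-the-envelope illustration of why a model fits into 2 GB or less: on-disk size is roughly parameter count times bytes per weight, so quantizing from 16-bit to 4-bit weights shrinks the file by 4x. A minimal sketch, using an assumed ~1.1B-parameter model (the figures are illustrative, not any specific release):

```python
# Rough on-disk model size: parameter count x bytes per weight.
# The parameter count below is an illustrative assumption.

def model_size_gb(n_params: float, bits_per_weight: int) -> float:
    """Approximate model size in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bits_per_weight / 8 / 1e9

params = 1.1e9  # hypothetical ~1B-parameter model

for bits in (16, 8, 4):
    print(f"{bits}-bit weights: {model_size_gb(params, bits):.2f} GB")
# 16-bit weights: 2.20 GB
# 8-bit weights: 1.10 GB
# 4-bit weights: 0.55 GB
```

This ignores overhead like tokenizer files and runtime activation memory, so real footprints run somewhat larger.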



