
I think this thread has lost the point being discussed (much like GPT's limited window!).

I was replying to this:

> If I have a chat with the model and I ask it "what did we talk about thirty minutes ago" it's as clueless as anything.

This criticism is true, but it would almost certainly be eliminated if the transformer's context window were increased to reach back far enough.

That wouldn't give it unlimited memory, sure. But the specific complaint, that it forgets the current conversation, doesn't require unlimited memory, only a window long enough to span the conversation.
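A minimal sketch of the mechanism, assuming a typical chat front end that resends the transcript on every turn and truncates the oldest messages to fit a token budget; the function name and the whitespace-based token estimate are illustrative, not any particular API:

  def truncate_history(messages, max_tokens=4096):
      """Keep the most recent messages that fit within max_tokens."""
      kept, used = [], 0
      for msg in reversed(messages):          # walk newest -> oldest
          cost = len(msg["content"].split())  # crude token estimate (assumption)
          if used + cost > max_tokens:
              break                           # older messages fall out of "memory"
          kept.append(msg)
          used += cost
      return list(reversed(kept))             # restore chronological order

Anything said before the cutoff never reaches the model at all, which is why "what did we talk about thirty minutes ago" fails; a larger window just moves that cutoff further back in the conversation.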
