There are still important distinctions. An RNN uses constant memory, while a transformer's memory grows with each new token. The two are related, but in theory an RNN can process an unbounded sequence, whereas a transformer cannot because its memory usage grows without bound.
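A minimal sketch of that memory difference (illustrative sizes and random weights, not any particular architecture): the RNN carries a single fixed-size hidden state forward, while a transformer-style decoder appends one key/value entry to its cache per token.

```python
import numpy as np

d = 8  # hidden/model dimension (illustrative)

# RNN step: new state has the same shape as the old one.
def rnn_step(h, x, W_h, W_x):
    return np.tanh(h @ W_h + x @ W_x)

rng = np.random.default_rng(0)
W_h = rng.normal(size=(d, d)) * 0.1
W_x = rng.normal(size=(d, d)) * 0.1

h = np.zeros(d)   # RNN memory: fixed size d
kv_cache = []     # transformer memory: grows with the sequence

for t in range(1000):
    x = rng.normal(size=d)
    h = rnn_step(h, x, W_h, W_x)        # still size d
    kv_cache.append((x.copy(), x.copy()))  # one more (key, value) pair per token

print(h.size)         # 8 — constant, independent of sequence length
print(len(kv_cache))  # 1000 — linear in sequence length
```

After 1,000 tokens the RNN's state is still `d` floats, while the cache holds 1,000 entries; an unbounded stream would eventually exhaust the latter.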


