Hacker News
dlkf on Jan 14, 2024 | on: Coding Self-Attention, Multi-Head Attention, Cross...
It’s debatable to what degree “attention” in LLMs relates to “attention” in psychology. See Cosma Shalizi’s note on this:
http://bactra.org/notebooks/nn-attention-and-transformers.ht...