Hacker News

Using a state-of-the-art neural net architecture, whether transformers or the like, trained with self-play to optimize non-differentiable but highly efficient architectures is the way forward.
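The core idea in that comment, searching for architectures when the objective is non-differentiable, can be sketched without any deep learning framework. The following is a minimal toy illustration, not anything from the thread: every name, the search space, and the reward function are made up. Because the reward is treated as a black box, a simple mutate-and-keep-the-best loop stands in for more elaborate self-play or policy-gradient schemes.

```python
import random

# Toy discrete search space over architecture choices (illustrative only).
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [32, 64, 128],
    "activation": ["relu", "gelu"],
}

def sample_arch(rng):
    """Draw a random architecture from the search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(arch, rng):
    """Change one choice at random, keeping the rest."""
    child = dict(arch)
    key = rng.choice(list(SEARCH_SPACE))
    child[key] = rng.choice(SEARCH_SPACE[key])
    return child

def reward(arch):
    """Stand-in for a non-differentiable objective, e.g. measured
    accuracy traded off against measured latency; no gradients exist."""
    flops = arch["depth"] * arch["width"] ** 2
    accuracy = 1 - 1 / (1 + flops / 10_000)  # more compute helps, saturating
    latency_penalty = 0.01 * flops / 1_000   # more compute costs time
    return accuracy - latency_penalty

def search(steps=200, seed=0):
    """Greedy evolutionary search: accept a mutation only if it improves."""
    rng = random.Random(seed)
    best = sample_arch(rng)
    best_r = reward(best)
    for _ in range(steps):
        cand = mutate(best, rng)
        r = reward(cand)
        if r > best_r:
            best, best_r = cand, r
    return best, best_r

if __name__ == "__main__":
    arch, r = search()
    print(arch, round(r, 3))
```

Real systems replace the toy `reward` with an actual train-and-evaluate run and the greedy loop with something sample-efficient (RL controllers, evolution with populations), but the structure is the same: propose a discrete architecture, measure it, and update the proposal distribution without backpropagating through the objective.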



According to Hinton, before transformers were shown to work well, learning model architectures was Google's main focus.



