
And maybe the possibility of building more efficient hardware. Floating point units take up a lot of transistors (in the ~100K transistor range AFAIK), whereas boolean logic is tiny.



Going to 1-bit precision is probably overkill, but papers have shown that neural networks quantized to 8-bit precision, running on simple integer ALUs, can give results equivalent to full floating-point implementations.
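
A rough sketch of what that looks like in practice (my own illustration, not from any particular paper): with symmetric int8 quantization, the matrix multiply runs entirely on integer adds and multiplies, with a single floating-point rescale at the end. The shapes and the per-tensor scale scheme here are made up for illustration.

    import numpy as np

    def quantize_int8(x):
        # Per-tensor symmetric quantization: map float32 to int8 via one scale.
        scale = np.max(np.abs(x)) / 127.0
        q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
        return q, scale

    rng = np.random.default_rng(0)
    w = rng.standard_normal((64, 64)).astype(np.float32)  # weights
    a = rng.standard_normal(64).astype(np.float32)        # activations

    qw, sw = quantize_int8(w)
    qa, sa = quantize_int8(a)

    # Integer matmul, accumulated in int32; one rescale recovers the float result.
    y_int = (qw.astype(np.int32) @ qa.astype(np.int32)).astype(np.float32) * (sw * sa)
    y_fp = w @ a

    print(np.max(np.abs(y_int - y_fp)))  # small quantization error vs. the fp32 baseline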



