I wouldn't be surprised if this causes hardware startups to pop up building accelerator cards tuned for this architecture. It seems stupidly simple to do inference in hardware, and with most of the training being quantized as well, you might even be able to provide speedups (and energy savings) for training with a reasonable investment, on cheaper process nodes than what Nvidia is using.
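
For what it's worth, here's a minimal sketch of why the inference side looks so hardware-friendly, assuming BitNet-style ternary weights in {-1, 0, +1} (my reading of the architecture, not any real accelerator's API): every dot product collapses into additions and subtractions, so you need no multipliers at all.

    # Multiply-free matvec for ternary weights. Illustrative only.
    import numpy as np

    def ternary_matvec(W, x):
        """y = W @ x where W holds only {-1, 0, +1}: adds and subs only."""
        y = np.zeros(W.shape[0], dtype=x.dtype)
        for i in range(W.shape[0]):
            row = W[i]
            y[i] = x[row == 1].sum() - x[row == -1].sum()  # no multiplies
        return y

    # Sanity check against an ordinary matmul
    rng = np.random.default_rng(0)
    W = rng.integers(-1, 2, size=(4, 8)).astype(np.int32)
    x = rng.standard_normal(8).astype(np.float32)
    assert np.allclose(ternary_matvec(W, x), W.astype(np.float32) @ x)

An actual ASIC would pack each weight into two bits and replace the loop with adder trees, but the point stands: the datapath is adders and muxes, not MAC arrays.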

Sure, Nvidia might eat their lunch in a couple of years, but Bitcoin ASICs prove that there's a viable niche in producing specialized processors, and VCs would probably jump at the chance to disrupt Nvidia's high-margin business.

There are like a million startups already promising analog / bit-level, inference-only, cheap computation.

There's rain.ai, d-matrix, etc.
