
Anyone who has to work in this ecosystem surely thinks this is a naive take



For someone who doesn't work in this ecosystem, can you elaborate? What's the real situation currently?


Nvidia's CUDA was first to market and easier to work with than OpenCL, which was its only real competition for the first decade and was then effectively abandoned. Because of that, everyone serious about this is using Nvidia hardware, so all the code gets written for Nvidia hardware.
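
To make the lock-in concrete, here's a minimal sketch of a bog-standard CUDA vector add (purely illustrative, assuming nothing beyond the stock runtime API): the <<<>>> launch syntax and calls like cudaMallocManaged are Nvidia-specific, so anything written this way has to be ported (HIP/ROCm, SYCL, OpenCL) before it runs on other vendors' hardware.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Vendor-specific kernel: compiles with nvcc and targets Nvidia GPUs only.
    __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;
        size_t bytes = n * sizeof(float);
        float *a, *b, *c;
        // Unified memory keeps the example short; cudaMallocManaged is CUDA-only.
        cudaMallocManaged(&a, bytes);
        cudaMallocManaged(&b, bytes);
        cudaMallocManaged(&c, bytes);
        for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

        // Nvidia-specific triple-chevron launch: one thread per element.
        vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
        cudaDeviceSynchronize();

        printf("c[0] = %f\n", c[0]);  // expect 3.0
        cudaFree(a); cudaFree(b); cudaFree(c);
        return 0;
    }

Multiply that by every kernel, library call, and profiler workflow in the ML stack and you get the retooling cost being described.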

The only way I could see AMD making inroads is if they were willing to sell compute on the level Nvidia puts in a data center at consumer prices, with relaxed licensing, to justify retooling the entire ML toolchain to work on a different architecture.

Geohot has documented his troubles trying to go all-in on AMD, and I believe he's back on Nvidia now.



