
Power density tends to be the limiting factor for this stuff, not money. If it's 30 percent slower per watt, it's useless.

The ratio between power usage and GPU cost is very, very different from CPUs, though. If you could save e.g. 20-30% of the purchase price, that might make it worth it.

e.g. you could run an H100 at 100% utilization 24/7 for 1 year at $0.4 per kWh (so assuming significant overhead for infrastructure etc.) and that would only cost ~10% of the purchase price of the GPU itself.
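
For reference, a quick sketch of that math. The 700 W board power and ~$25k purchase price are my own assumptions, not figures from the comment; the $0.40/kWh all-in rate (covering cooling/infrastructure overhead) is the one quoted above:

    # Back-of-envelope: annual electricity cost of an H100 vs. its purchase price.
    # Assumed figures: 700 W board power, ~$25,000 purchase price.
    # The $0.40/kWh all-in rate (incl. cooling/infra overhead) is from the comment above.
    tdp_kw = 0.7
    hours_per_year = 24 * 365
    rate_usd_per_kwh = 0.40
    gpu_price_usd = 25_000

    annual_cost = tdp_kw * hours_per_year * rate_usd_per_kwh
    print(f"Annual energy cost: ${annual_cost:,.0f}")                # ~$2,450
    print(f"Share of GPU price: {annual_cost / gpu_price_usd:.0%}")  # ~10%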

The cost of power usage isn't the electricity bill itself but the capacity and cooling.

Yes, I know that; hence I quadrupled the price of electricity. Or are you saying that the cost of capacity and cooling doesn't scale directly with power usage?

We can increase that by another 2x and the cost would still be relatively low compared to the price/depreciation of the GPU itself.
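
Continuing the sketch above with the rate doubled to $0.80/kWh (same assumed 700 W board power and ~$25k purchase price):

    # Doubling the assumed all-in electricity rate from the earlier sketch.
    annual_kwh = 0.7 * 24 * 365  # ~6,132 kWh at 100% utilization
    cost = annual_kwh * 0.80
    print(f"~${cost:,.0f}/year, about {cost / 25_000:.0%} of an assumed $25k purchase price")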
