Hacker News

I found a DGX H100 for $300k. The internet tells me such a server draws about 10 kW. I found a 30 kW solar system with 90 kWh of lithium battery storage for $70k. At utility scale I don't think there's any risk of AI causing a problem for the grid when a dedicated power supply for an AI server can clearly be purchased for less than 20% of the total system cost. (Realistically I expect the lifespan of the solar panels and batteries is easily 2x-4x the lifespan of the server, so it's probably closer to 5% or even lower.)
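The cost share works out like this (a back-of-envelope sketch using the prices quoted above; the 2x-4x lifetime multiples are the commenter's guesses, not measured figures):

```python
# Hypothetical list prices from the comment above, USD.
GPU_SERVER_COST = 300_000      # DGX H100
SOLAR_BATTERY_COST = 70_000    # 30 kW solar + 90 kWh lithium battery

# Power system as a fraction of total spend.
power_share = SOLAR_BATTERY_COST / (GPU_SERVER_COST + SOLAR_BATTERY_COST)
print(f"Power share of total spend: {power_share:.1%}")  # ~18.9%

# If the solar/battery hardware outlives the server 2x-4x,
# amortize its cost over multiple server lifetimes.
for lifetimes in (2, 4):
    amortized = SOLAR_BATTERY_COST / lifetimes
    share = amortized / (GPU_SERVER_COST + amortized)
    print(f"Amortized over {lifetimes} server lifetimes: {share:.1%}")
```

At a 4x lifetime multiple the power share drops to about 5.5%, matching the "closer to 5%" claim.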

Solar and batteries make power basically a solved problem. Relative to GPUs, it's cheap and scalable.




You need to account for cooling away the 10 kW of heat coming off the server.


Does that bump it up to 20 kW? I'm not sure it matters much, nor does the power generation method. I used solar and batteries because you can get them off the shelf in small quantities, but my main point was to demonstrate that buying GPUs requires an incredible amount of capital. Powering and cooling them requires some capital too, but it's a fraction of the cost of the GPUs.


So your battery is good for 9 hours of compute time. What happens if you have a couple of cloudy days in a row? And what's the insolation during the winter? (In many places it's a tenth of the summer amount.)
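The 9-hour figure follows from the numbers upthread, and the seasonal concern can be sketched the same way. The capacity factors here are illustrative assumptions (roughly 20% average output in summer, a tenth of that in winter, per the comment), not measurements for any particular site:

```python
# Back-of-envelope check of the objection above, using figures from the thread.
BATTERY_KWH = 90          # lithium storage
SERVER_LOAD_KW = 10       # DGX H100 draw, before any cooling overhead
SOLAR_KW = 30             # nameplate panel capacity

runtime_h = BATTERY_KWH / SERVER_LOAD_KW
print(f"Battery alone runs the server for {runtime_h:.0f} hours")  # 9 hours

daily_load_kwh = SERVER_LOAD_KW * 24  # 240 kWh/day, running 24/7
for season, capacity_factor in (("summer", 0.20), ("winter", 0.02)):
    daily_solar_kwh = SOLAR_KW * 24 * capacity_factor
    balance = daily_solar_kwh - daily_load_kwh
    word = "surplus" if balance >= 0 else "deficit"
    print(f"{season}: {daily_solar_kwh:.0f} kWh/day generated vs "
          f"{daily_load_kwh:.0f} kWh/day consumed ({word} of {abs(balance):.0f} kWh)")
```

Under these assumed capacity factors, a 24/7 load of 10 kW isn't covered even in summer, which is the point of the cloudy-days objection: either the load must be intermittent or the solar and battery must be sized well beyond the quoted system.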





