Hacker News

Just because AI-specific GPUs are in short supply today doesn't make your argument stronger.

You are complicating the comparison by bringing in the incentives for Bitcoin mining. That is irrelevant to the fact that Bitcoin today consumes more power than ever, driven by rising demand for the commodity.

By the same token, rising demand is what's going to triple today's energy use for AI, and probably push it past Bitcoin's in the future.




In Bitcoin, the difficulty of the computation is key to the product.

In AI, it's an inconvenience to be optimized away if at all possible. If someone figures out how to make AI models run on an iPhone with zero battery use, everyone goes "yay!", not "oh no our entire financial system collapsed".

AI and Bitcoin may both use a lot of power, but there's a vast difference in whether they want to. (There's also a difference in how useful they are, but that's a separate argument.)


> If someone figures out how to make AI models run on an iPhone with zero battery use, everyone goes "yay!"

This is what I call wishful thinking.

I never said Bitcoin and AI have similar energy use patterns, which is the argument you seem keen to make. My original argument was that AI usage is going to be worse than Bitcoin in the long run. With some AI farms using ~3000W per server, it's a no-brainer where this is all headed. Add in the fact that AI has so far proven more popular at launch than Bitcoin was.

Just because you can optimize models doesn't mean people won't want or need bigger, better models to train and use.



