There’s also the Jevons paradox possibility that more efficient AI will drive even more demand for AI chips, because more uses will be found for them. But possibly not super-high-end NVDA chips, but instead little Apple iPhone AI cores or smartwatch AI cores, etc.
Not all commodities will work like fossil fuels did in the Jevons paradox, though. It could be that, as efficiency improves, demand for AI doesn’t grow fast enough to keep demand for chips as high as it was.
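To make that concrete with toy numbers (a minimal sketch assuming chip demand scales with total AI workload divided by per-chip efficiency; the figures are made up for illustration, not taken from anywhere):

```python
# Toy model: chips needed = total AI workload / work each chip can do.
# Illustrative numbers only, to show when a Jevons-style rebound does or doesn't happen.

def chips_needed(workload: float, efficiency: float) -> float:
    return workload / efficiency

baseline = chips_needed(workload=100.0, efficiency=1.0)    # 100 chips

# Efficiency doubles, AI demand grows only 50% -> chip demand falls (no rebound).
no_rebound = chips_needed(workload=150.0, efficiency=2.0)  # 75 chips

# Efficiency doubles, AI demand triples (new uses found) -> chip demand rises (rebound).
rebound = chips_needed(workload=300.0, efficiency=2.0)     # 150 chips

print(baseline, no_rebound, rebound)  # 100.0 75.0 150.0
```

The paradox only kicks in if demand for AI grows faster than efficiency does; otherwise the efficiency gain eats into chip demand.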
> But possibly not super-high-end NVDA chips, but instead little Apple iPhone AI cores or smartwatch AI cores, etc.
We tried that, though. NPUs are in all sorts of hardware, and they're wasted silicon for most users, most of the time. They don't do LLM inference, they don't generate images, and they don't train models. Too weak to work, too specialized to be useful.
Nvidia "wins" by comparison because they don't specialize their hardware. The GPU is the NPU, and it's power scales with the size of GPU you own. The capability of a 0.75w NPU is rendered useless by the scale, capability and efficiency of a cluster of 600w dGPU clusters.