And, sure enough, there's a new AI chip from Intellifusion in China that's supposed to be 90% cheaper: 48 TOPS of int8 training performance for US$140.[1]
I wonder what the cost of power to run these chips is. If the power cost ends up being large compared to the hardware cost, it could make sense to buy more chips and run them only when power is cheap. They could become a large source of dispatchable demand.
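A quick back-of-the-envelope check (the power draw, duty cycle, and electricity prices below are my assumptions, not figures from the article):

  # Rough comparison of annual power cost vs. hardware cost.
  # All numbers are assumed, for illustration only.
  CHIP_COST_USD = 140        # quoted price
  POWER_DRAW_W = 20          # assumed draw for a small edge NPU
  HOURS_PER_YEAR = 24 * 365
  FLAT_RATE = 0.10           # assumed $/kWh, running around the clock
  OFF_PEAK_RATE = 0.03       # assumed $/kWh, cheap hours only

  energy_kwh = POWER_DRAW_W / 1000 * HOURS_PER_YEAR     # ~175 kWh/yr at 100% duty
  print(energy_kwh * FLAT_RATE)                         # ~$17.5/yr, always on
  print(energy_kwh / 3 * OFF_PEAK_RATE)                 # ~$1.8/yr, running 1/3 of hours

At these assumed numbers the $140 of hardware dominates, so buying extra chips and only running them when power is cheap pays off only if per-chip power draw (or the electricity price) is much higher than I've guessed.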
Int8 training has very few applications, and int8 ops generally are very easy to implement. Int8 is a decent inference format, but supposedly doesn't work well for LLMs that need a wide dynamic range.
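A toy numpy sketch of the dynamic-range problem (the values are made up; the single large outlier stands in for the activation outliers LLMs are known for):

  import numpy as np

  # Per-tensor symmetric int8 quantization: one outlier sets the scale,
  # and every small value rounds away to zero.
  x = np.array([0.01, -0.02, 0.03, 0.05, 60.0])
  scale = np.abs(x).max() / 127
  q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
  print(q)                             # [0 0 0 0 127]
  print(q.astype(np.float32) * scale)  # [0. 0. 0. 0. 60.] -- the small activations are gone

Finer-grained scaling (per-channel or per-block) is the usual workaround, at the cost of some of the simplicity that makes int8 cheap in the first place.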
[1] https://www.tomshardware.com/tech-industry/artificial-intell...