One thing I think about: over time, more and more training will probably move closer to users' devices.
- Client-side training carries a lot of financial advantages; you can push the cost of the silicon, storage, and electricity onto the user.
- There are privacy benefits, which, while not a major driver of adoption, are something people think about.
- Apple does this already. They're going to keep doing it. When Apple makes a decision, it instantly affects a massive share of the human population in a way that no other company can, and it indirectly influences other companies.
I think you're right that "inference costs will dominate" is a short-sighted take. But I think the better question is where training will happen. Nvidia is weirdly poorly positioned to have a strong hand in client-side training. None of their products offer a cost-efficient, power-efficient strategy for it, except for Tegra, which has seen zero consumer uptake outside of the Nintendo Switch. There's no hundred-billion-dollar client-side AI training strategy anywhere near the RTX 3070 in my gaming Windows PC; that ain't happening. I'm doubtful they can make that pivot; there's a lot of entrenched interest, and legitimately great products, from the existing computer & smartphone manufacturers. Apple has their chips. Google has their chips, and a really strong relationship with Samsung. Microsoft will be an ally, but they have very little power to convince their Windows users that a $1,400 laptop is better than an $800 one because it has local AI training capability.
But, I mean: server-side training is still going to be huge, and Nvidia will still be an extremely successful company. It's just that when you consider their share of all the AI training that will happen in 2030, it's going to drop, and the biggest factor behind that drop isn't going to be AMD; it's going to be client-side training on chips made by Apple, Google, and others.