Besides the "GPT-4 cost about $100 mil to train" number, everything else is still just a number you pulled out of your ass.
Why would you estimate that the $100 million training bill would require a billion in GPUs? That's kind of like saying that getting $100 million worth of water would take a billion dollars in plumbing.
I don't know what the numbers are, but I'm not going to start just making them up.
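The water-and-plumbing point above is really about conflating an operating bill with a capital outlay. Here's a rough sketch of why the inference "$100M to train, therefore $1B in GPUs" doesn't follow on its own. Every number in it (rental rate, GPU price, run length, hardware lifetime) is a hypothetical assumption for illustration, not a claimed fact:

```python
# All figures below are illustrative assumptions, not claimed facts.
GPU_HOURLY_RATE = 2.0        # assumed rental $/GPU-hour
TRAINING_BILL = 100e6        # the ~$100M training-cost figure
GPU_PURCHASE_PRICE = 25_000  # assumed $/GPU if bought outright
RUN_DAYS = 90                # assumed wall-clock length of one run
GPU_LIFE_YEARS = 5           # assumed useful life of the hardware

gpu_hours = TRAINING_BILL / GPU_HOURLY_RATE   # 50M GPU-hours of compute
gpus_needed = gpu_hours / (RUN_DAYS * 24)     # GPUs running for the whole run
capex = gpus_needed * GPU_PURCHASE_PRICE      # sticker price if you bought them

# A single run only consumes a fraction of the hardware's useful life,
# so the capital cost attributable to that run is much smaller:
life_hours = GPU_LIFE_YEARS * 365 * 24
amortized = capex * (RUN_DAYS * 24 / life_hours)

print(f"GPUs needed: {gpus_needed:,.0f}")
print(f"Full capex: ${capex / 1e6:,.0f}M")
print(f"Capex amortized to this run: ${amortized / 1e6:,.0f}M")
```

Under these made-up inputs the sticker price lands around half a billion, and the share of it actually "used up" by one run is an order of magnitude less than the training bill. Swap the assumptions and the answer swings wildly, which is exactly the point: without stated assumptions, a headline dollar figure isn't an estimate.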
Because his "back-of-the-napkin" estimate that someone is going to spend $100 billion to train a "GPT-6" (whatever that means) is laughably bad. These aren't estimates; they're uninformed guesses pulled from nowhere.