Besides the "GPT-4 cost about $100 mil to train" number, everything else is still just a number you pulled out of your ass.

Why would you estimate that a $100 million training bill would require a billion dollars in GPUs? That's kind of like saying that a $100 million water bill implies a billion dollars in plumbing.
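
To spell out why the conversion isn't automatic: turning a training bill (rental) into a hardware bill (capex) means assuming a rental rate, a run length, and a GPU price, and the answer swings wildly with each. A quick Python sketch, where every input except the $100M is a placeholder I made up:

    # Back-of-the-envelope only. Everything except the $100M
    # training figure is a placeholder chosen to show the
    # arithmetic, not a claim about real rates or hardware.
    train_cost = 100e6      # GPT-4 training bill (from the thread)
    rental_rate = 2.0       # assumed $/GPU-hour
    run_days = 90           # assumed length of the run
    gpu_price = 25_000      # assumed purchase price per GPU

    gpu_hours = train_cost / rental_rate    # 50M GPU-hours
    gpus = gpu_hours / (run_days * 24)      # ~23,000 GPUs in parallel
    capex = gpus * gpu_price                # ~$580M of hardware
    print(f"{gpus:,.0f} GPUs, ~${capex / 1e9:.2f}B capex")

Halve the assumed run length or double the rental rate and the implied capex doubles or halves. The output is an artifact of the placeholders, not an estimate.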

I don't know what the real numbers are, but I'm not going to just start making them up.

Have you ever heard of a "back-of-the-napkin" estimate?

Why so much reluctance to even try to make a reasoned guess?


Because his "back-of-the-napkin" estimate that someone is going to spend $100 billion to train a "GPT-6" (whatever that means) is laughably bad. These aren't estimates; they're uninformed guesses pulled from nowhere.


I provided my reasoning, and you just seem incapable of understanding it. Here, The Information literally just published an article on a $100 billion data center: https://www.theinformation.com/articles/microsoft-and-openai...

It seems the most ignorant also tend to be the most arrogant.
