I doubt the data OpenAI receives from the usage of ChatGPT is even remotely close to the amount of data Google or Facebook handles that could be used to build a better language model.
They have an advantage over other AI startups, but I don't think they have much of an advantage over companies like Google and Facebook, which both have massive amounts of data, money, and computing power — currently the three most important things needed to build models that can operate at internet scale.
OpenAI can never run ChatGPT at the scale needed for massive usage; they would drown in debt. That's why ChatGPT is extremely slow, often crashes, and is protected by a rate limiter.
Your email account must look very different from mine, which consists almost entirely of marketing spam and GitHub notifications… I don't recall the last time I sent an email.
> OpenAI can never run ChatGPT at the scale needed for massive usage. They would drown in debt.
They have access to Microsoft's billions and the compute power of Azure. The other day there was an HN post [1] about Microsoft having built a top-5 supercomputer. That is probably the computer OpenAI used to train their model.