They basically are MS by now. Everyone at Microsoft I work with literally calls it an 'acquisition', even though they only own a share. It's pretty clear what their plans are.
> Microsoft will reportedly get a 75% share of OpenAI's profits until it makes back the money on its investment, after which the company would assume a 49% stake in OpenAI.
49% isn't _just_ a share; it's a significant portion of the company.
Of course, and it's also not _just_ a share, which is what the comment I was responding to called it. 49% of a company's outstanding shares is a significant portion. It's 2% away from a controlling stake, and a fun merger event.
> Or are they just operating at a massive loss to kill off other competition?
Bingo.
> the deal with Microsoft for cloud services making it cheap?
It should make it cheaper, but it takes time and engineers to migrate workloads from AWS (which is reasonably adept at scaling) to Azure, which is not.
I think they run k8s in groups of something like 4k nodes, which is spectacular, because k8s isn't really designed to do that. Running it at that scale is challenging because the traffic required to coordinate is massive (well, it was the last time I looked into it).
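For a sense of why the coordination traffic matters, here's a rough sketch of how steady-state node-to-control-plane chatter grows with cluster size. All the constants are made-up illustrative assumptions, not measured or published Kubernetes or OpenAI numbers, and this ignores pods, events, and watch fan-out entirely:

```python
# Back-of-envelope sketch (not a measurement): per-node heartbeat/status
# traffic to the control plane, scaled by node count. All constants below
# are illustrative assumptions.

NODE_COUNTS = [100, 1_000, 4_000, 7_500]

LEASE_INTERVAL_S = 10    # assumed node lease renewal period
STATUS_INTERVAL_S = 60   # assumed full node status update period
LEASE_BYTES = 1_000      # assumed size of a lease renewal
STATUS_BYTES = 15_000    # assumed size of a node status object

for nodes in NODE_COUNTS:
    bytes_per_s = nodes * (LEASE_BYTES / LEASE_INTERVAL_S
                           + STATUS_BYTES / STATUS_INTERVAL_S)
    print(f"{nodes:>6} nodes -> ~{bytes_per_s / 1e6:.2f} MB/s of steady-state "
          "node-to-apiserver chatter (before pods, events, and watches)")
```

The point is just that the baseline load grows linearly with node count, and the real traffic (scheduling, watches, events) grows faster than that.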
Probably the first two, plus first-mover brand recognition. Millions of $20 monthly subs for GPT-4 add up.
They might also be operating at a loss for all I know, but I suspect they're one of the few that can break even just based on scale, brand recognition, and economics.
I haven’t heard any evidence that they have millions of Plus subscribers.
I’ve seen 100 to 200 million active users, but nothing about paid users from them. The surveys I saw in a quick Google search reported much less than 1% of users paying.
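Plugging the numbers from this thread (100–200M actives, well under 1% paying, $20/month) into a quick script gives a sense of the range. These are the thread's figures, not anything OpenAI has confirmed:

```python
# Rough arithmetic using the ranges mentioned in this thread,
# not official OpenAI numbers.

PLUS_PRICE_PER_MONTH = 20  # USD

for active_users in (100_000_000, 200_000_000):
    for conversion in (0.005, 0.01):  # 0.5% and 1% of actives paying
        paid = active_users * conversion
        monthly_revenue = paid * PLUS_PRICE_PER_MONTH
        print(f"{active_users / 1e6:.0f}M actives, {conversion:.1%} paying "
              f"-> {paid / 1e6:.1f}M subs, ~${monthly_revenue / 1e6:.0f}M/month")
```

So even at sub-1% conversion you can land around a million or two subscribers, which is real revenue but not obviously enough to cover training and inference costs.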
There are also just the benefits of being in market, at scale, and being exposed to the full problem space of serving and maintaining services that use these models. It's one thing to train and release an OSS model; it's another to put it into production and run all the ops around it.
Probably some combination of all the above! I think 1 and 2 are interlinked though — the cheaper they can be, the more they build that moat. They might be eating the cost on these APIs too, but unlike the Uber/Lyft war, it'll be way stickier.
I think it's mostly the scale. Once you have a consistent user base and tons of GPUs, batching inference/training across your cluster allows you to process requests much faster and for a lower marginal cost.
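A toy model of why batching helps (the constants are made up, purely to show the shape of the curve): each forward pass carries a fixed overhead of loading weights and launching kernels, and batching amortizes that across every request in the batch, so the marginal cost per request drops quickly as batch size grows.

```python
# Toy cost model for batched inference. Constants are arbitrary and only
# illustrate how fixed per-pass overhead gets amortized across a batch.

FIXED_COST_PER_PASS = 1.00    # assumed fixed cost of one forward pass
VARIABLE_COST_PER_REQ = 0.05  # assumed incremental cost per request in a batch

for batch_size in (1, 8, 32, 128):
    cost_per_request = FIXED_COST_PER_PASS / batch_size + VARIABLE_COST_PER_REQ
    print(f"batch={batch_size:>3}: cost per request ~{cost_per_request:.3f} "
          "(arbitrary units)")
```

You only get big batches if you have a steady stream of requests, which is why scale itself is the moat here.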
Or is it the deal with Microsoft for cloud services making it cheap?
Or are they just operating at a massive loss to kill off other competition?
Or something else?