
"A lot of money" is a lot less money per user than to buy 64GB RAM to run an inferior model locally + energy and opportunity costs. The OpenAI APIs are super cheap for a single user needs. I expect them to be at least close to breaking even with their APIs pricing.



> "A lot of money" is a lot less money per user than to buy 64GB RAM

If OpenAI can't earn a couple hundred bucks per user over the typical lifetime of a computer, the added value they provide is very low (several times less than Spotify or Netflix, for instance), and they'll never be “the next Google”.

And if they can, then it makes sense to buy the hardware once instead of paying several times that price through a subscription.

> The OpenAI APIs are super cheap for a single user's needs. I expect them to be at least close to breaking even with their API pricing.

“Close to breaking even” means the price you pay is VC-subsidized; the expected gross margin for this kind of tech company is more than 50%. Expect to pay a lot more if/when the market is captive, and that cost will scale linearly with your use of the technology.
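
As a rough sketch (the cost figure is purely an assumption), moving from break-even pricing to a typical 50%+ gross margin means the price at least doubles:

  # Illustration of the margin argument; the cost figure is an assumption.
  cost_to_serve_per_month = 10.0    # assumed provider cost per user per month
  breakeven_price = cost_to_serve_per_month
  target_gross_margin = 0.5         # 50%+ gross margin typical for this kind of company
  captive_price = cost_to_serve_per_month / (1 - target_gross_margin)
  print(breakeven_price, captive_price)  # 10.0 vs 20.0: at least double

So a bill that breaks even today becomes twice as large once the provider prices for a normal margin, and it grows with usage on top of that.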

> energy and opportunity costs

What opportunity cost?


> Expect to pay a lot more if/when the market is captive.

Yes, that's a possibility, but cloud computing has become a commodity.

That said, I can see why people would pay to have their own private and unfiltered models/embeddings.

> If OpenAI can't earn a couple hundred bucks per user over the typical lifetime of a computer, the added value they provide is very low (several times less than Spotify or Netflix, for instance), and they'll never be “the next Google”.

They don't have to worry about this today.

> What opportunity cost?

You could put that money and time toward other things.





