
Of course it's nothing else. Who could possibly believe that OpenAI and others would dump billions into development and training yet not be smart enough to figure out they could do it for $500?



It's the Llama 3 training cost plus their cost. Meta "kindly" covered the first $700M.

> We add a vision encoder to Llama3 8B


They didn't train the vision encoder either; it's an unchanged SigLIP model from Google.
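
(For context: what's being described is the standard LLaVA-style recipe. Here's a minimal sketch, assuming Hugging Face transformers; the class name, checkpoint names, and the single linear projector are illustrative, not the paper's actual code, though 1152 and 4096 are the real hidden sizes of SigLIP-so400m and Llama 3 8B. Essentially all the capability comes from the two frozen pretrained models; the cheap fine-tuning step mostly pays for the small projector between them.)

    # Minimal sketch (not the authors' code) of the LLaVA-style recipe:
    # a frozen, pretrained SigLIP vision encoder bolted onto a pretrained
    # Llama 3 8B through a small linear projector, so fine-tuning only
    # needs to train the projector (plus optionally the LLM).
    import torch
    import torch.nn as nn
    from transformers import AutoModel, AutoModelForCausalLM

    class SigLipLlama(nn.Module):
        def __init__(self):
            super().__init__()
            # Google's SigLIP encoder, used unchanged and kept frozen.
            siglip = AutoModel.from_pretrained("google/siglip-so400m-patch14-384")
            self.vision = siglip.vision_model
            for p in self.vision.parameters():
                p.requires_grad = False
            # Meta's Llama 3 8B, reused as-is.
            self.llm = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B")
            # The only newly trained part: map SigLIP features (1152-d)
            # into the LLM's embedding space (4096-d).
            self.projector = nn.Linear(1152, 4096)

        def forward(self, pixel_values, input_ids):
            # Encode the image, project its patch features into the LLM's
            # embedding space, and prepend them to the text embeddings.
            img = self.vision(pixel_values=pixel_values).last_hidden_state
            img_tokens = self.projector(img)                         # (B, N, 4096)
            txt_tokens = self.llm.get_input_embeddings()(input_ids)  # (B, T, 4096)
            return self.llm(inputs_embeds=torch.cat([img_tokens, txt_tokens], dim=1))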


“We finetuned billions of dollars of research by Google and Meta.”


While that may be true, the opposite has also happened to hundreds of companies in other areas:

https://news.ycombinator.com/item?id=39136472

Many companies also optimize for tools, like Python, that boost productivity rather than price/performance. OpenAI had billions of other people's money, so they might simply keep using the tools that worked before.

Lastly, there are tons of published papers on techniques that claim to reduce cost. Most of them aren't good, and neither are their benchmarks. Even reviewing most of them takes more time than a lot of AI researchers have. Those that do make it into established communities usually come with gotchas alongside the benefits. So they could also simply be missing a needle in a large haystack.

I think you're right that they'd be using whatever genuinely worked without a loss in model performance; it's just that they might not, for the reasons above. The rational choice is for others to keep experimenting with those techniques in case they yield a competitive advantage.


It would have been a lot cheaper for OpenAI if they'd had access to Llama 3 in 2018.


You have clearly not read the article. $500 is the cost of fine-tuning.


Fair enough. Is it now safe to say that OpenAI could have gotten by with an 8B model plus $500 of fine-tuning instead of running a (much) larger model on their GPU cluster?


Maybe they did


> Who could possibly believe that OpenAI and others would dump billions into development and training yet not be smart enough to figure out they could do it for $500?

People upvoting the post??

Not really sure. But P.T. Barnum said there's always plenty of them out there.

Pretty sure they mean fine-tuning, though?

But even that is total tripe.

These guys are snake oil salesmen. (Or Sylvester McMonkey McBean is behind it.)



