
Currently, AI folks create bigger models and feed them larger training data sets because we still don't know the efficiency limits. IOW, we currently can't accelerate learning beyond a certain point with less training data. This is esp. true in the LLM/GenAI space.

In other areas where NNs are used, what I can see is that training models with less data is not only plausible but very possible, esp. in image processing, probably because an image carries more information than a single sentence.

I think AI falls on the spectrum with a slight bias toward the latter group, because if you can shrink a model by 10% but lose a month, you'd rather have a 10% bigger model now and reap the money^H^H^H^H^H fame.

I don't know anything about a typical web app, because I'm not a typical developer developing web apps. :)




I just think we're talking about a completely different problem with AI optimization. Billions of dollars of effort and research go into AI optimization. Scaling model size and training set size happens because it's what the research and evidence tell us will reliably improve model performance. If we could reduce any of it for the same performance, we would. Top model performance is an arms race, and it's happening at any expense. The largest players are all shooting to beat GPT-4 or Claude Opus and achieve AGI (whatever that means).

This is very different than a program that requires zero research breakthroughs to dramatically improve and is simply slow and bloated because people have different priorities.


> If we could reduce any of it for the same performance we would.

Nope, because all of those options are harder than just going bigger.

> Top model performance is an arms race and it's happening at every expense.

This is also what I said: "Growth (in model performance) is king, so quick and dirty (going bigger) beats harder optimization efforts."

> This is very different than a program that... [Snipped for brevity]

Again, this is what I meant by "it keeps development momentum". Yes, people have different priorities; mostly money and fame at this point.

So, we don't disagree one bit.



