Hacker News

> Today building a gazillion param model is only possible by the ultra rich

True, but in 5 years there’ll be an open source equivalent running on commodity GPUs.




No, there will not. Yes, you may have a GPT-4 substitute running on your 3090, but the billionaires will have GPT-666 or whatever running on a supercomputing cluster, guzzling a significant fraction of the world's data every day and playing high-frequency idea trading on a scale never before seen.


I hope so, and that there is some kind of Moore's law for memory - especially GPU memory. Even the mighty H100 has "only" 80 GB. As model sizes grow exponentially, memory sizes don't seem to be catching up. But yes, I hope these do get commoditized soon.
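The gap is easy to make concrete with a rough back-of-the-envelope calculation (the function and figures below are illustrative, not from the thread): just holding a model's weights in half precision takes 2 bytes per parameter, before activations or KV cache.

```python
def weight_memory_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Rough VRAM needed just to hold model weights (fp16/bf16 by default)."""
    return n_params * bytes_per_param / 1e9

# A 70B-parameter model in fp16 needs ~140 GB for weights alone --
# already well past a single 80 GB H100, let alone a 24 GB 3090.
print(weight_memory_gb(70e9))  # 140.0
```

By this rough measure, single-GPU memory would need to grow severalfold just to keep pace with today's frontier models.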

What scares me is the economics of this. The so-called democratized/commoditized chips are still controlled by Nvidia, and it's not clear to me why Nvidia would give that up.

One thing I really wish could happen is the equivalent of the SETI@home project for model training and inference! (No BTC/crypto, please.)


"The cat is out of the bag" so to speak.



