
This would restrict AI usage to those who can afford a beefy PC, no?



Costs will come down, while capability will go up. I'm thinking of some kind of box in every home, with special inference chips; not really a "PC".

In principle, every human should have access to the same level of AI. There could be one at the local library if someone can't afford one, or doesn't want one.


Not everyone even has access to a local library.


Or it would create an incentive to reduce the cost of the hardware needed to run beefier AI models and pass those savings on to consumers.


Which would still mean that those who can afford more/better hardware can therefore afford more/better AI. I don't think a solution lies down this path...


Is that really an issue? E.g. it's the same with games: the better your hardware, the more likely you can play a new game at higher settings with better performance, while most cheaper devices are outright unable to handle anything recent.

But if you wait a GPU generation or two, the performance has improved enough that you can do that with a much cheaper GPU.


There are already people working on distributed solutions to that problem, and many more working on these problems more broadly.


That would still be a far larger class of people than the top 1%. And in any case, the capabilities of consumer devices will keep rising for some time yet.



