Hacker News

Isn't access to massive datasets and computation the moat? If you and your very talented friends wanted to build something like GPT-4, could you?

It's going to get orders of magnitude less expensive, but for now, the capital requirements feel like a pretty deep moat.




How do you know massive datasets are required? Just because that's how current LLMs are built doesn't mean it's the only possible approach.


Then the resources needed to discover an alternative to brute-forcing a large model become the barrier instead.

I think academia and startups are currently better suited to optimizing TinyML and edge AI hardware, compilers, frameworks, etc.



