OpenLLaMA is though. https://github.com/openlm-research/open_llama

All of these are surmountable problems.

We can beat OpenAI.

We can drain their moat.




For the above, are the RAM figures system RAM or GPU?


CPU RAM


Absolutely, 100% agree. I just wouldn't touch the original LLaMA weights. There are many amazing open source models being built that should be used instead.


> We can drain their moat.

I've got an AI powered sump pump if you need it.


They certainly don't need or deserve the snark, least of all on Hacker News.



