
Sure, but when a single 12 GB GPU costs ~$800 new (e.g., the 3080 LHR), "a couple of dozen" of them is a big barrier to entry for the hobbyist, student, or freelancer. Cloud computing offers an alternative route, but, as stated, distribution introduces a new engineering task, and the month-to-month bills for the compute nodes can still add up surprisingly quickly.
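
To make "add up quickly" concrete, here's a quick back-of-envelope in Python. The hourly rate is my own assumption, not a quote from any provider:

    # Back-of-envelope cloud bill (illustrative prices, not provider quotes).
    gpu_hourly_rate = 1.50   # assumed $/hr to rent one mid-range GPU
    num_gpus = 24            # "a couple of dozen"
    days = 14                # a two-week training run

    total = gpu_hourly_rate * num_gpus * 24 * days
    print(f"Estimated bill: ${total:,.0f}")   # -> Estimated bill: $12,096

And that's per run; a few failed or repeated runs multiply it.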



We are talking groups, not individuals. I think it is quite possible for a couple hundred people to cooperate and train something at least as big as LLaMA 7B in a week or two.
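
For anyone wanting to sanity-check a schedule like that, the usual back-of-envelope is the ~6*N*D training-FLOPs heuristic. The sketch below makes the knobs explicit; every figure in it is my assumption, not something from this thread, and fewer tokens or more participants shrink the estimate proportionally:

    # Feasibility sketch using the common ~6*N*D training-FLOPs heuristic.
    # Every input below is an assumption; plug in your own numbers.
    params = 7e9              # LLaMA-7B-scale model
    tokens = 1e12             # LLaMA 7B was reportedly trained on ~1T tokens
    total_flops = 6 * params * tokens          # ~4.2e22 FLOPs

    num_gpus = 200            # "a couple hundred" volunteers, one GPU each
    tflops_per_gpu = 12       # assumed sustained (not peak) consumer-GPU rate
    cluster_flops = num_gpus * tflops_per_gpu * 1e12

    days = total_flops / cluster_flops / 86400
    print(f"~{days:.0f} days under these assumptions")   # -> ~203 days

Training on fewer tokens (say, 100B) cuts that to ~20 days at the same scale, so token budget, head count, and per-GPU throughput dominate the answer.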



