Hacker News

Code, model, and data, all under Apache 2.0. Impressive.

Curious how this project was allowed to be more genuinely open source, compared with Llama's interesting new take on "open source". Are other projects restricted in some way by technical or legal issues, with a desire to be more like this one? Or was there an initiative to break the mold this time around?




LLMs are trained on the entire internet, so on loads of copyrighted data, which Meta can't distribute and is afraid even to reference.


This argument doesn't make sense to me unless you're talking about the training material. If that's not the case, how does it relate to the license Meta attempts to force on downloaders of the LLaMA weights?


They're literally talking about the training material.


The data is Creative Commons.


Yeah, but there's a CLA for some reason. I'm wary that they'll switch to a new license down the road.


So get it today. A license can't be retroactively changed on you: copies you already received stay under the terms they were distributed with.


Yeah, but it's a signal that they aren't thinking of the project as a community project. A CLA centralizes rights toward the maintainers in an unequal way: contributors grant them broader rights than the contributors themselves receive. Apache 2.0 without a CLA would otherwise be fine.



