
> No large-scale email scraper has the budget necessary to run the content it scrapes through a LLM.

That's why it's awesome to be able to run open-source LLaMA locally! Not to mention that, since it runs on your own hardware, OpenAI never becomes aware of your shenanigans.
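For what it's worth, local inference really is that simple these days. Here's a minimal sketch using llama-cpp-python with a quantized GGUF build of LLaMA; the model filename, prompt, and sample text are just placeholders, not anything from the thread:

    # Minimal local-inference sketch (pip install llama-cpp-python).
    # Assumes you have some quantized GGUF LLaMA model on disk; the path is hypothetical.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./llama-3-8b-instruct.Q4_K_M.gguf",  # placeholder local model file
        n_ctx=4096,    # context window
        n_threads=8,   # CPU threads; no GPU or API key required
    )

    scraped_text = "Hi, please reach me at jane.doe@example.com about the invoice."

    out = llm(
        f"Extract any email addresses from the following text:\n\n{scraped_text}\n\nAddresses:",
        max_tokens=64,
        stop=["\n\n"],
    )
    print(out["choices"][0]["text"].strip())

Nothing in that leaves the machine, and the marginal cost per document is just CPU time, which is the point the parent comment is disputing.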



