
It's trivial to fine-tune Llama to be NSFW if that's what you want. A rough sketch of what that looks like is below.
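For what it's worth, this is roughly what such a fine-tune looks like with the Hugging Face transformers + peft stack (LoRA adapters). The checkpoint name, data file, and hyperparameters are placeholders, not a recipe — just enough to show the scale of effort involved relative to training a foundation model.

    # Rough LoRA fine-tuning sketch using Hugging Face transformers + peft.
    # The checkpoint name, data file, and hyperparameters are placeholders.
    from datasets import load_dataset
    from peft import LoraConfig, get_peft_model
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer, TrainingArguments)

    base = "meta-llama/Llama-2-7b-hf"  # any Llama checkpoint you have access to
    tokenizer = AutoTokenizer.from_pretrained(base)
    tokenizer.pad_token = tokenizer.eos_token  # Llama has no pad token by default
    model = AutoModelForCausalLM.from_pretrained(base)

    # LoRA freezes the base weights and trains small adapter matrices,
    # which is what makes this feasible on a single GPU.
    model = get_peft_model(model, LoraConfig(
        r=8, lora_alpha=16, lora_dropout=0.05,
        target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM"))

    # Expects a JSONL file with a "text" field per example (hypothetical path).
    data = load_dataset("json", data_files="chat_samples.jsonl")["train"]
    data = data.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
                    remove_columns=data.column_names)

    Trainer(
        model=model,
        args=TrainingArguments(output_dir="llama-lora", per_device_train_batch_size=1,
                               num_train_epochs=1, learning_rate=2e-4),
        train_dataset=data,
        data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False),
    ).train()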

But there's an entire universe of much more interesting apps that people don't want NSFW stuff in. That's why most foundation models filter it out.




Anything involving Llama is not trivial - if I can't do it on my phone through a website, then you shouldn't expect anyone else to be able to do it. If your instructions involve downloading something, or even so much as touching the command line, it's a non-starter for 95% of users.

Get something on the level of character.ai and then you can tell me it's "trivial".


The context of this thread is a company spending $4B on a 4-year plan to build foundation models. A single person could do what I suggested in somewhere between days and months of work, including building a user-friendly front end.

In the context of this thread it is trivial.


I don't think that's the reason. You wouldn't get anything "NSFW" if you didn't ask/prompt for it.


The point, though, is that the market potential is huge, and it would be a way to grow fast with cash flow. As a side effect, you would probably also develop the best NSFW filter in the world.


> way to grow fast with cash flow.

Until the US payment processors cut you off, and then you go bankrupt.



