I have some experience helping previous employers block bandwidth abuse from user-uploaded content. That led to inadvertently finding some pretty bad content. I would figure out some way to have someone else review reported content, or at least use some kind of automated scanner to pre-check it before reviewing it yourself.
Some stuff is hard to unsee.
Edit: I wonder if a local LLM (to help with privacy concerns) would be a good option for at least identifying anything obviously bad. Wish I had more concrete suggestions.
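For what it's worth, the common industry approach to this pre-check isn't an LLM but perceptual hashing against vetted lists of known-bad hashes (e.g., Microsoft's PhotoDNA or Meta's PDQ, with hash lists distributed through organizations like NCMEC). As a purely illustrative sketch of the idea, here's a toy "average hash" in plain Python; it assumes you already have grayscale pixel data, and a real deployment would use a proper hash algorithm and an official hash database, not this:

```python
def average_hash(pixels, hash_size=8):
    """Toy perceptual hash of a grayscale image.

    `pixels` is a 2D list of 0-255 values. The image is block-averaged
    down to hash_size x hash_size cells, and each cell becomes one bit
    depending on whether it is above the overall mean.
    """
    h, w = len(pixels), len(pixels[0])
    cells = []
    for by in range(hash_size):
        for bx in range(hash_size):
            # Average all pixels falling into this block.
            y0, y1 = by * h // hash_size, (by + 1) * h // hash_size
            x0, x1 = bx * w // hash_size, (bx + 1) * w // hash_size
            block = [pixels[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return tuple(1 if c > mean else 0 for c in cells)


def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))


def matches_blocklist(pixels, blocklist, threshold=5):
    """True if the image is within `threshold` bits of any known-bad hash."""
    h = average_hash(pixels)
    return any(hamming(h, bad) <= threshold for bad in blocklist)
```

The Hamming-distance threshold is what makes this robust to minor edits (re-encoding, small crops) in a way an exact SHA-256 match is not, which is why the real systems work this way.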
You absolutely have a point. But for me, I'm not sure how to balance privacy and safety. Is my service really private at all if I'm handing off user files to a third party to do who knows what to scan for bad content, and potentially risk users through false positives?
Edit: A local model could work, but that can be quite compute intensive and therefore expensive.
There's no balance to be had--you must prioritize legality over privacy. You will be storing CSAM if you don't do something. You may already be storing CSAM. This is no joke. This is real and something every image hosting site deals with. You need to take it seriously. This is a "you could go to jail" concern, not a "this project might not work out" concern. The ability to store and share media privately while knowing it won't be scanned for abuse, with a free tier that doesn't even require an email address to sign up, is begging to be used for CSAM and other illegal activities. That's the sort of site you'd set up if your explicit goal was to attract CSAM. MEGA offers a similar service and they are severely burdened with abuse.
I meant it only for reported content, which to me is a proper balance, because taking down content that is reported is roughly your legal requirement[0]. But since reporting is ripe for abuse, the proper flow is to first hide the content, review and confirm it's bad, and then take proper action.
So I would try asking around or thinking of how best to handle the specific reported cases without exposing yourself too directly.
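To make the hide-first flow concrete, here's a minimal sketch of that state machine (names and structure are mine, purely illustrative): a report immediately quarantines the item so it stops being served, and only a human review decides between removal and restoring visibility.

```python
import enum


class Status(enum.Enum):
    VISIBLE = "visible"
    QUARANTINED = "quarantined"  # hidden pending human review
    REMOVED = "removed"


class ReportQueue:
    """Minimal report-handling flow: hide on report, act only after review."""

    def __init__(self):
        self.status = {}

    def upload(self, item_id):
        self.status[item_id] = Status.VISIBLE

    def report(self, item_id):
        # Hide immediately: reported content is never served while it waits.
        if self.status.get(item_id) is Status.VISIBLE:
            self.status[item_id] = Status.QUARANTINED

    def review(self, item_id, is_violation):
        # A human confirms or dismisses; only then is the final action taken.
        if self.status.get(item_id) is Status.QUARANTINED:
            self.status[item_id] = (
                Status.REMOVED if is_violation else Status.VISIBLE
            )
```

The key property is that a false report only hides content temporarily rather than deleting it, while genuinely bad content is off the wire the moment anyone flags it.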
This is a legally...risky strategy. You built a cool thing but unfortunately when you put cool things online they get used for the worst possible purposes.
Yes, I understand there is an unfortunate risk. However, I oppose file scanning, and so do many users, as we've seen with the Apple scandal.
If any content that violates the terms of service is uploaded and reported, it will be deleted as soon as possible, and that user will almost certainly be permanently banned.
The terms of service cannot shield you from federal law. Not sure if you're in the US or elsewhere but similar laws are prevalent around the world - in the US federal law prohibits the production, advertisement, transportation, distribution, receipt, sale, and possession of child sexual abuse material (CSAM).
This is an issue that could ruin you. The only reason it hasn't already is that the service isn't big enough yet. You undoubtedly already have CSAM on your network and any reasonable person with experience online would expect that, which is an important standard for you to consider. You're starting your own projects online at 17. You will do a lot of cool stuff in your life. Don't let this kill that inertia.
Your site offers private hosting; how do you expect reports to happen? The people sharing the CSAM won't report it; they're the ones that want to abuse your site. They'll happily share the content privately among themselves and you'll never know until the police knock on your door. A reporting-based system only works if the images are public and available in a feed so that you can "crowdsource" your moderation. It doesn't work at all--not even a little bit--when the posts are private. I urge you, strongly, to reconsider your plan here before something bad happens. I don't get the sense that you grasp the seriousness of this concern.
Your terms of service shield you from nothing. They don't limit your liability here at all.
I don't believe it is feasible to run an image hosting service with your intended CSAM management plan.
Based on this comment thread, I fully expect your site will soon host CP/CSAM, if it doesn't already. Other image hosting services devote extensive resources to engineering a solution to this problem (one that is more robust than user reporting). I would not expect you will be able to avoid this work, and avoid liability.
Edit: I just noticed you're quite young. Congrats on all the great work. I think this CSAM thing could bring misery your way so I hope you find this comment helpful. Good luck out there.
Get ready for requests to pull content from all kinds of organizations. We ran something similar a few years ago; once it became a bit popular, it got bombarded with all kinds of illegal material like CP. We had a contact at the police whom we were supposed to notify to report the illegal content. We shut it down because we would have had to implement continuous monitoring of everything that was uploaded.