I always wonder how sites like these deal with stuff like potentially hosting child porn/other content that is illegal to possess. Does DMCA safe harbor stuff cover it? (I don't see a DMCA notice on the site)
You can leave out the 'potentially'. I shut down a file sharing service specifically for that reason; it is just about impossible not to become a vector for the transmission of illegal or objectionable content. The DMCA does not cover it, because the DMCA is about copyright, not about content that is illegal to possess regardless of how you got it. That said, the police (at least, the police here) are more than happy to work together with you to help keep the CP peddlers out, but it's a losing battle: there are a very large number of them, you'll be hard pressed to work 24/7 all year long, and since users can upload encrypted files you can't be sure of what's being transferred anyway.
Which is a good thing; after all, what is being uploaded is strictly speaking none of my business. But it automatically makes you complicit, in roughly the same way that running a Tor exit node makes you complicit, and I don't want any part of that. I'm perfectly ok with setting up useful services at very little or no fee at all, but I refuse to become an unwitting and unwilling partner in other people's illegal dealings.
We're lucky that some people are fine with becoming an unwilling partner, since without them there would be no internet, no phone, no email, no snail mail, no roads, no electricity, no open source software, no nothing. It's a sad fact of life that pretty much every useful technology, service or piece of infrastructure will be used for illegal dealings of some kind.
It's a matter of proportion to me. If a service is used predominantly to facilitate illegality then I see no reason to continue to run it.
The internet, phones, email, regular mail, roads, electricity, software and so on have predominantly good and productive uses. File sharing websites attract, percentage-wise, more bad than good, or at least more bad than what I'm comfortable with.
So in the end it's a moral call, and in this case my decision was to shut it down: given the types of crimes involved, the criminals on the service totally outnumbered the 'good guys', and our ability to deal with the assholes was limited.
It's obvious that you're trolling, but just in case you're not: eventually you can expect a push against cash in the name of anti-terrorism and anti-crime campaigns. Obviously those will be founded on bullshit, but that's not going to stop it from happening. Whether or not it will succeed is mostly a matter of how it gets sold to the public.
As for how it is possible for all of the available currency to be contaminated with traces of drugs without that proving that all (or even a majority) of cash transactions are drug related, I'm sure you're smart enough to figure that one out for yourself. But evidence such as this will figure prominently in the kind of discussion that will visit us some years into the future.
False analogy. Creating a file upload service with no authentication and no resources to keep it safe is like installing a fully loaded AK-47 at an intersection and hoping people will just use it to learn how it works, or use it in an emergency to shoot at criminals passing that intersection.

There are free file upload services, and there are ways to upload from the command line(1).
This is a general problem with banning things instead of simply regulating them somehow. How would one scientifically study methamphetamine, for example, in a country where it is illegal to even possess it?
Without arguing for or against CP: what if I wanted to look up evidence on whether use of CP leads to an increase or decrease in actual child abuse, without setting off red flags everywhere? As a Psych major, that sort of thing would interest me, for example.
It's actually quite well designed and questions such as these were definitely taken into account during the design phase.
I came up with a similar scheme about 15 years ago (as a result of operating that file sharing service) and proposed to the local LE that we set up a service where a 'fingerprint' of an image could be tested against known-bad images: if an image tested positive it would be flagged for review, and on an exact match it would be automatically banned.
The local law enforcement officer thought it was a great idea, but it would never fly: even the hashes of the images were considered off-limits for sharing with others, and they'd have had to share their database of hashes with me for me to set this up (free of charge).
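For what it's worth, the exact-match half of that scheme is trivial to sketch. The following is a minimal, hypothetical illustration (the database contents and function names are made up): a cryptographic digest of the upload is checked against a known-bad set, and an exact hit is banned outright. The "flag for review" half would need a perceptual hash (PhotoDNA-style) rather than a plain digest, since recompressed or resized copies won't hash identically.

```python
import hashlib

# Hypothetical known-bad database: in the real scheme this would be the
# set of hashes law enforcement refused to share.
KNOWN_BAD = {hashlib.sha256(b"known-bad-sample").hexdigest()}

def screen_upload(data: bytes) -> str:
    """Return 'ban' on an exact match against the known-bad set,
    otherwise 'accept'. Near-match flagging would require a perceptual
    hash pass here, which a plain digest cannot provide."""
    digest = hashlib.sha256(data).hexdigest()
    if digest in KNOWN_BAD:
        return "ban"
    return "accept"
```

The weakness the thread already points out applies here too: encrypted uploads hash to something unique every time, so a scheme like this only catches unmodified plaintext copies.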
Eventually MS came up with PhotoDNA and they're too big to ignore.
Agreed, but given the fact that this feeds into the legal system in a fairly direct way and that I'm only a 'one man shop' by their definition it makes good sense to insist on doing business with a larger party.
Where it goes haywire is that they then have to trust all the employees of that larger party as well but that's logic rather than CYA.
MSFT provides a set of sample images[1], some of which return as if they matched. The sample images are not child pornography; they are generally pictures of celebrities (which is really strange).
I know. I helped them beta test that here in NL. Unfortunately it's not just still images: it's also videos, and encrypted still images and encrypted videos. PhotoDNA comes into its own once you actually have an image.