While I fully understand the outrage, I'd like to offer another reason why child faces are in use: sex trafficking. My previous research was in face recognition, mostly in artificial face aging algorithms. One of the trouble areas in that domain was predicting how a child would look after X years, because of the elasticity of children's faces. The DOD offered grants for this type of research to help thwart child abduction and sex trafficking.
Obviously this was before all of these types of stories began to emerge, but I like to bring this up as an opposing ethical debate.
Is sex trafficking of children really a significant problem? (Yes, everyone has heard of the Dutroux case, but the depths of that won't be plumbed by face aging algorithms.)
Altogether, the line of reasoning sounds like the usual hype you put into grant applications. Your line of work is highly strained hydrocarbons, so when you apply to the Department of Energy you say that in the distant future you won't just synthesize 100 mg, you'll prepare hundreds of tons to put into a rocket to increase its payload.
I think by now it's pretty much obligatory that any defense of mass surveillance will mention at least one of child sex trafficking, drug dealing, or terrorism.
Every article I read about encryption regulation is like this. "Pedophiles and terrorists", every single time. What's frightening isn't the argument itself but how it instantly shuts down any debate. Politicians have found the perfect weapon to justify any law.
I would think that at least on HN we're more sophisticated than that. Just because child trafficking, drug dealing, and terrorism are abused by politicians to excuse mass surveillance doesn't mean they aren't real problems, or that surveillance tech can't help with them. Reasonable people consider trade-offs, rather than casting everything in terms of "sides" and "defending" theirs.
The choice, as I see it, is between:
1. Easy-to-use and effective encryption for all people, even pedophiles and terrorists.
2. Backdoored, untrustworthy encryption that governments will almost certainly abuse to surveil entire populations.
Number 1 sounds like a very reasonable principle to have. Governments and police forces have more than enough tools at their disposal to fight crime. WhatsApp uses end-to-end encryption, and my country's police forces routinely catch child molesters who use it. They do not need, and should not have, a "show me everything there is to know about this guy" button that they don't even need a warrant to press.
Certain things should be inviolable, even to the state, even with a warrant.
It's not necessarily that the child was trafficked as a child. It could be that the child went missing/ran away, lived on the street or in some other bad situation for a while, and then at some point was trafficked. Or it could even be that the child was sent to a relative in another country to grow up there, the relative never took any pictures of them, they grew up there, and then were conned into some trafficking situation as an adult.
Either way, the goal would still be to identify an adult victim of trafficking by the last available picture of them, which was of them as a child.
> In 2001, UPenn released a study conducted in 17 cities across the United States [in which] they estimated as many as 300,000 American youth may be at risk of commercial sexual exploitation at any time. However, the actual number of children involved in prostitution is likely to be much smaller: over 10 years only 827 cases a year had been reported to police departments. https://en.wikipedia.org/wiki/Sex_trafficking_in_the_United_...
Estimates differ wildly because many feel actual reports dramatically undercount reality. IMO even one is too many, but exaggerating does more harm than good.
> Is sex trafficking of children really a significant problem?
You know, I think you can be adamantly opposed to mass surveillance and still take child sex trafficking seriously. I personally know people (plural) this has happened to, and it's pretty irritating to see the issue cavalierly dismissed for argumentative purposes or tech-cred points.
So... people specifically opted in to release their photos under a Creative Commons license, and now they're outraged that people are using the photos under the terms of the license?
And some of them chose a license that required attribution, and now they're outraged that the photos are attributed to them?
> By law, most Americans in the database don’t need to be asked for their permission — but the Papas should have been.
> As residents of Illinois, they are protected by one of the strictest state privacy laws on the books: the Biometric Information Privacy Act, a 2008 measure that imposes financial penalties for using an Illinoisan’s fingerprints or face scans without consent. Those who used the database — companies including Google, Amazon, Mitsubishi Electric, Tencent and SenseTime — appear to have been unaware of the law, and as a result may have huge financial liability, according to several lawyers and law professors familiar with the legislation.
IANAL, but I would be surprised if attaching a Creative Commons license to a photo qualified as consent under regulations about use of biometrics.
Read the article more closely: at least one person they interviewed specifically assigned their photos a non-commercial Creative Commons license, which means the pictures shouldn't be used for things like what MegaFace is doing (clearly commercial purposes!).
They're using it for academic research, which is noncommercial by some definitions. The companies involved only used it to participate in the competition.
Most people are not legal analysts. Your snarky characterization of this overlooks the fact that what can be anticipated at the time consent is given may be overtaken by technological developments, that corporations often structure their activities legally so that 'research' is non-commercial even if it is intended to be used later for commercial activity, and that it's polite to ask before doing something you might hesitate to grant consent for yourself.
People need to be able to say: "oops, undo this permission thing that I clicked OK to, as I now better understand the ramifications of what I clicked ... and I don't like it"
There are also a lot of rules to try to make sure the person really understands what they're signing up for, with mandated forms etc. And then, as you say, there's still a cooling-off period.
Do you say the same of yourself when you sign up for a random online service and blindly check the “Accept TOS” checkbox?
Contracts and licenses are sometimes tricky to unequivocally enforce when one party entered into the contract or granted a license by mistake, unknowingly, etc.
On Flickr (where the photos came from), the default license is "All Rights Reserved". To apply a Creative Commons license, you need to go into a menu and select the license you want from a list. So that's why I said they "specifically opted in".
Ok Bill, for 4 hours a day you're going to work on this non-commercial academic open GitHub repo and push your things there, and then for the next 4 hours you're going to pull it into our for-profit solution, where you'll take things further. Remember, if you think of anything that will really benefit the company, don't do it until the second 4-hour slot!
See, I know this is the case, but I think people intended "you can use this for reasonable causes." Like, if you lend your buddy your car, there's an implicit "don't rob the bank" in there. If you CC-release a picture with your child visible in the background and someone cut and pasted that kid's photo onto a naked body, you'd be like "what the fuck, man?!" There's an implicit expectation of reasonableness that isn't necessarily enforced by the licence.
It's a pity; I think if these fears propagate, there'll be less CC content.
As with most privacy issues, the real issue here is scale. None of these people ever explicitly consented to the use of their data in this way. And changing your mind is often sensible when you encounter new information or come to a new understanding of an existing situation.
Hell, I can't believe you're expressing any confusion. This is a pretty simple abuse of trust, apparently attributable to Flickr/Yahoo. It's almost like it's completely normal to not base your emotional reaction on copyright licenses—the outrage is legitimate no matter how legally protected the abuser is.
Those CC licences are all about copyright. The article is mostly about privacy and data protection. If I take a photograph of another person I can give away my copyright but I can't give away the other person's privacy rights. If that person lives in the EU probably even they can't give away their privacy rights: permission to use personal data is revocable according to the GDPR, I think, in most cases.
I wonder if any of the people whose photographs are in the database have moved to the EU, and whether that makes any difference.
It's a mistake to think that checking a box that says "CC" means you actually had permission to issue a license.
If you're selling stock photos, you need written consent for everything in the photo: artwork, brands, people. Interestingly, parents can give consent for their kids to be in a photograph, but those same identifiers can be used years later to identify the now-adult people in those photos.
A related question is this: do pre-trained classifiers reveal any personally-identifying information? Or are they only statistics about many, many faces?
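To make that question concrete: the trained weights themselves are aggregate statistics over many training faces, but the per-photo embeddings such a model produces behave like biometric identifiers, because two embeddings of the same person can be matched against each other. Here's a minimal sketch of that matching step in Python, using random placeholder vectors instead of any real model's output (nothing here comes from the MegaFace pipeline):

import numpy as np

# Placeholder embeddings standing in for what a face-recognition model would
# output for individual photos (a fixed-length float vector per face).
# These are random stand-ins, not output from any real model or dataset.
rng = np.random.default_rng(0)
person_x_photo_1 = rng.normal(size=128)
person_x_photo_2 = person_x_photo_1 + rng.normal(scale=0.1, size=128)  # same person, slight variation
person_y_photo = rng.normal(size=128)  # unrelated person

def cosine_similarity(u, v):
    # Real systems threshold a score like this to decide "same person or not".
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# The weights are statistics over many faces; the embeddings below are tied to
# specific individuals and can be matched, which is where the privacy concern lies.
print(cosine_similarity(person_x_photo_1, person_x_photo_2))  # close to 1.0: likely a match
print(cosine_similarity(person_x_photo_1, person_y_photo))    # near 0.0: likely different people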