
Rephrase as "They are responsible for capturing 99% of the illegal content that is caught."

i.e., of the illegal content that goes through their systems, only some fraction is ever caught at all, and the vast majority of what is caught is caught by Facebook's own systems. The 99% figure says nothing about how much slips through undetected; e.g. (hypothetical numbers), if 1,000,000 illegal items flow through and 100,000 are caught, Facebook catching 99,000 of those still leaves 900,000 unseen.

Note, this article is recent and highly relevant: https://www.nytimes.com/interactive/2019/09/28/us/child-sex-...

> And when tech companies cooperate fully, encryption and anonymization can create digital hiding places for perpetrators. Facebook announced in March plans to encrypt Messenger, which last year was responsible for nearly 12 million of the 18.4 million worldwide reports of child sexual abuse material

> Data obtained through a public records request suggests Facebook’s plans to encrypt Messenger in the coming years will lead to vast numbers of images of child abuse going undetected. The data shows that WhatsApp, the company’s encrypted messaging app, submits only a small fraction of the reports Messenger does.




> Note, this article is recent and highly relevant..

Talk about fear-mongering; that article is horrible. It's the same old argument: "child abuse is bad, therefore you can't have any privacy." They position you as being against protecting children when your stance is actually pro security & privacy.


I think that reasonable people can hold both f̶a̶c̶t̶s̶ things in their heads at the same time

1. Privacy is important and good

2. It has almost certainly contributed to an explosion in the production and sharing of child pornography and abuse

The question is what is the moral way to reconcile the two, not to deny that either exists.


Very unconvinced it's led to an explosion in child abuse.

Child abuse has always been happening (several of the older members of my family were abused); it just wasn't broadcast. Even today I bet 99% of child abuse is never caught on camera. Any increase attributable to child porn is probably negligible.


If you actually read the article above, you can see there's now a lot of new evidence that viewing child porn makes people more likely to commit abuse, and that it seems to be a causal relationship, not a "marijuana is a gateway drug" relationship.


I couldn't find that in the article and would be immensely skeptical of such a claim (how on earth would you establish causation as opposed to common cause?).


Neither of those two things are facts. One is an opinion, and the other is speculation.


When did GP specify he was talking about facts?


I edited my comment from "facts" to "things," which was actually what I meant. Apologies. I'll clarify, so as not to disparage the person who replied.


Great points. It definitely seems to me that those two are fundamentally at odds. Suppose we champion (1): perfect privacy gives criminals a lot of leeway to commit crimes such as those you stated, and more, without anyone being able to detect that they did them.

However, without perfect privacy, every world citizen would be subjected to such monitoring, and we'd basically be living in 1984, with the metaphorical "telescreen" functionality spread across pretty much every device that's connected to a network.


They’re not complete opposites, since criminals don’t always have perfect privacy: they leave other evidence that can be discovered with a warrant. On the flip side, mass surveillance will just create selective enforcement.


The article never makes the argument you're objecting to. It merely states that E2E is likely to allow certain crimes to avoid detection. That seems both plausible and empirically correct, considering their WhatsApp/Messenger comparison.


Exactly. I see it from those exact same perspectives.

The USA took steps at its founding to protect property, something the Supreme Court has interpreted to include privacy as well. From this perspective, the rights of citizens wither away when the government is allowed to take even the smallest step towards mass surveillance. For that reason I am very much against it.

My other perspective is from a security standpoint. I believe that if companies are not doing everything in their power to protect themselves and their users from data loss / hacking / theft, they put everyone at risk. Intentionally lessening the security of a product at the request of the US government means giving a potential thief or hacker more attack vectors to exploit.


If this article is for real, then it's certainly not being pushed mainstream.

> The Times’s reporting revealed a problem global in scope — most of the images found last year were traced to other countries — but one firmly rooted in the United States because of the central role Silicon Valley has played in facilitating the imagery’s spread and in reporting it to the authorities.

Clearly the NYT is laying the problem at tech's feet, and we are the ones best able to thwart it. Many times I've scoffed at the government's continual removal of privacy, but this is the first time it's sunk in. Perhaps they have a case.

> 1998 - 3k cases. 2008 - 100k. 2014 - 1M. 2018 - 18.4M.

These figures are from a total of 45M images flagged.

Again, is this for real, or is this propaganda?


These aren't new images; they're the same images that have been around the web for decades. That's why they can be flagged at all (via hashing, unless companies are training neural networks to distinguish child pornography from regular pornography). You need to benchmark those numbers against the increase in regular images sent over the web in general. If the flagged count is growing faster than that baseline, there's an argument; if it's growing slower, the opposite argument holds.
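
For context, a minimal sketch of what hash-based flagging looks like (this uses exact-match SHA-256 purely for illustration; real systems like Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, and the hash value and paths below are made-up placeholders):

    # Minimal sketch: match files against a set of hashes of previously
    # identified images. NOTE: a plain SHA-256 only matches byte-identical
    # files; production systems use perceptual hashing instead.
    import hashlib
    from pathlib import Path

    KNOWN_HASHES = {
        # Made-up placeholder digest of a previously identified image.
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def sha256_of(path: Path) -> str:
        # Stream the file in chunks so large images don't need to fit in memory.
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def is_known(path: Path) -> bool:
        # Flag the file if its digest matches a previously identified image.
        return sha256_of(path) in KNOWN_HASHES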


The NYT article refers to "both recirculated and new images" and states that "in some online forums, children are forced to hold up signs with the name of the group or other identifying information to prove the images are fresh." Isn't it just common sense that the proven existence of online paedophile communities that explicitly encourage the "production" of new child abuse material, in fact results in some net increase in the number of abused children, even if that increase is a small proportion of the total number of victims? Isn't it worth asking what can be done about it?


It is not surprising that creating automated reporting systems increases the number of reports. The article simply gives no way to make an informed opinion.

There are two things I would point out: 1) I would be surprised if the ease of communication the internet brought did not benefit criminals; 2) the article describes several problems that won't be fixed by an encryption ban (e.g. the lack of resources for report clearinghouses) and also gives examples of cases solved despite encryption. I would like to understand why encryption is described as the problem here.


I didn't read the article as blaming tech. The main takeaway for me was the astonishing under-resourcing of investigation and prosecution of these crimes. If the government can't be bothered to investigate 98% of the child sex abuse cases it already knows about (per Flint Waters's testimony -- which dates from a time when the problem was far less severe than it is now), why does it need to prevent encryption to possibly learn about a handful more cases?

Not to mention:

> Congress has regularly allocated about half of the $60 million in yearly funding for state and local law enforcement efforts. Separately, the Department of Homeland Security this year diverted nearly $6 million from its cybercrimes units to immigration enforcement — depleting 40 percent of the units’ discretionary budget until the final month of the fiscal year.

Or in other words, in order to fight the imaginary rapists that Mexico is allegedly sending us, DHS is diverting money originally allocated to investigate actual child rapists.



