How Facebook and Instagram became marketplaces for child sex trafficking (theguardian.com)
49 points by sandebert on April 27, 2023 | 77 comments



Crimes are committed all the time using phones, and yet we do not allow phone companies to monitor, record, and report the content of our phone conversations. And they can't just add a line to a terms of service; they have no legal right to do it. (Ignore the issue of metadata; I mean the audio portions of our calls.)

So why have we accepted that it is OK for all of our private electronic conversations to be monitored?

These are horrific crimes described here, but all the solutions proposed reinforce, and start from, the presumption that every SV company has a legal right to snoop on every conversation you have on their platforms.

I don't think you can have it both ways. If you routinely spy on your customers to sell ads, manipulate their behaviors and build sophisticated AI to make inferences about what every keystroke and mouse movement means, then you don't get to turn your head and pretend the crimes you witness on your platform aren't real.


>Crimes are committed all the time using phones, and yet we do not allow phone companies to monitor, record, and report the content of our phone conversations. And they can't just add a line to a terms of service; they have no legal right to do it. (Ignore the issue of metadata; I mean the audio portions of our calls.)

Are you sure about that? It is my understanding that all telco traffic is monitored. Some countries may have laws that prevent real-time conversations/SMS from being looked at for a few months.


Correct, I can't speak for all the world. In the United States, only a party to a phone conversation may record that call. Some states have stronger requirements than that, requiring all parties to consent.

The phone company can't listen to your phone calls, and there is definitely no public expectation, nor are journalists demanding, that AT&T and others do something about preventing crimes committed over a phone by recording all calls and passing them through a Crime Detector (TM).


Ignore the illegal spying by US intelligence agencies on American citizens on American soil (past and/or whatever is still ongoing). Is this an example where the US may "let" a friendly nation spy on US citizens and then just get that information from them?

So rather than the NSA hoovering up US phone call content, which they almost certainly have the capability of doing, they just let the UK do it for them?


That happened. I can't recall the source, but the UK was spying on US citizens and passing the information to the US through a "Five Eyes" agreement.


Yes, Five Eyes is what I was thinking of; I just couldn't remember the name.


> The phone company can't listen to your phone calls

Perhaps not explicitly legal, but calls are monitored all the time after switch upgrades. We even applied US-specific patches that disable the operator override tone that lets you know someone is on the call. I had to do this after every switch upgrade. One time that specific patch flubbed in a non-obvious way, and when I jumped onto a random call the people said, "Is someone there? Who is on our call?" and I disconnected.

For what it's worth, my boss had an arrangement with the three letter agencies to give them unfettered access to all the switches and we would even enable non-logging test mode to prevent any of their commands from being logged. My boss and most of his direct reports were not US citizens and one of them openly hated Americans so they did not really care. This was a couple decades ago so maybe by now everybody follows the spirit of the law.


Wire taps are a thing. I don't know the law around it, but it likely requires approval from a court.

https://en.wikipedia.org/wiki/Wiretapping#United_States

Otherwise, telecom companies keep records of all calls that take place on their network - the phone numbers, time, duration, etc.


This is an excuse that makes no sense. As the article makes clear, this is not happening only in private chats:

> It could, for instance, be investing more to develop better tools to “flag suspicious words and phrases on unencrypted parts of the platform – including coded language around grooming,” he said. “This is, fundamentally, not a technological problem, but one of corporate priorities.” (There is a separate debate about how to handle encryption. Meta’s plans to encrypt direct messages on Facebook Messenger and Instagram have recently drawn criticism from law enforcement agencies, including the FBI and Interpol.)
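To make that concrete, here is a minimal sketch (Python, purely illustrative, and obviously nothing like Meta's actual system; the phrases and function names are made up for the example) of what flagging coded language on unencrypted, public content could look like:

    import re

    # Purely illustrative list of coded phrases; a real system would use a
    # large, continuously updated lexicon built with domain experts, plus ML
    # classifiers, account-level signals, and human review.
    SUSPICIOUS_PATTERNS = [
        re.compile(r"\bnew in town\b", re.IGNORECASE),
        re.compile(r"\bavailable 24/7\b", re.IGNORECASE),
        re.compile(r"\broses\b", re.IGNORECASE),  # widely reported slang for money
    ]

    def flag_public_post(text: str) -> list[str]:
        """Return the patterns a public post matches, for routing to human review."""
        return [p.pattern for p in SUSPICIOUS_PATTERNS if p.search(text)]

    # A match should queue the post for a trained moderator, not trigger
    # automatic action; false positives on innocuous posts are inevitable.
    hits = flag_public_post("New in town this weekend, ask about roses")
    if hits:
        print("flag for review:", hits)

Everything hard is omitted here: the real lexicon, multilingual slang, adversarial misspellings, and the review pipeline. Which is exactly why the quote frames it as a question of corporate priorities rather than technology.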


None of this is Meta's problem though. Requiring platforms to monitor user activity is unreasonable.

The main issue here is one of education. By 12 years old, a person should have learned enough about sexual exploitation in school that if a stranger asks them for nude photos, they would call the police.


I think the early generation of internet kids had trouble with this and other things, like the permanence of Facebook, etc. Today, my kid and her friends are highly sensitive to "creepers" who might be lurking, and to their tactics. They are well aware of the pitfalls of posting on social media and all that. It seems society has adapted.


> It seems society has adapted.

Apparently not, going by the article.


> Requiring platforms to monitor user activity is unreasonable

That is absurd. Facebook spends billions algorithmically sorting their posts, and they could spend a tiny fraction filtering them, just like they already do when required by law.


Facebook and Instagram do a significant amount of automated filtering. People from scammers to spammers to pimps have incentives to try to bypass the filtering. It's a cat and mouse game Meta has a significant incentive to stay ahead in; everyone from advertisers to app store owners to users will punish them for having too much of any of that, or even legal adult content.

Policing is the job of the police. If criminals are advertising sex with children on Instagram to the point that child molesters can find it, police can find it too. They can then use the low-tech method of posing as buyers, witnessing illegal acts, and arresting the offenders.

Instagram should provide assistance when legally required and technically possible of course, but should not be expected to perform law enforcement functions. That's not just because it would be some sort of undue burden on the huge corporation operating it, but because it lacks the checks and balances we apply to police agencies.


> phone conversations

What if you could call your local cell carrier and request SMS pictures of "underaged local girls" along with their phone numbers?

I think the public structure of Instagram is horrific, especially since they apparently allow unsolicited DMs. Even "private" chatrooms like Discord try to block random DMs by default.

And that is setting aside the public marketing of the girls, as nl pointed out. This is way beyond a DM privacy issue.


> But the vast majority of the content that Meta reports falls under child sexual abuse materials (CSAM) – which includes photos and videos of pornographic content – rather than sex trafficking. Unlike with child sexual abuse imagery, there is no legal requirement to report child sex trafficking, so NCMEC must rely on all social media companies to be proactive in searching for and reporting it. This legal inconsistency – the fact that child sexual abuse imagery must be reported, but reporting child sex trafficking is not legally required – is a major problem, says Staca Shehan, vice-president of the analytical services division at NCMEC.

That says it all. When legally required, Facebook is aggressive and expeditious about enforcement. Otherwise, they are not.


Not to absolve Facebook or Instagram, but it's clear that the entire model of unrestricted access to the global internet is ripe for abuse of vulnerable demographics. Also see scamming the elderly with fake profiles of relatives claiming they've been kidnapped, and so on.

I wouldn't claim to know the solution. But our starting point is platforms built to give strangers unsupervised digital time with other strangers. One of those parties may be a pervert or predator, and the other may be a child or teenager.

You could destroy Facebook and Instagram, and I guarantee this moves to Snapchat, to TikTok, or to whatever other platforms fill the void.


>One of those parties may be a pervert or predator, and the other may be a child or teenager.

I don't know if it matters, but this is generally replicated in a public park, playground, library, store, subway, etc. The people on social media are the same people walking around every day.


Kids are generally monitored in parks and such. You can't easily creep on them without another adult or kid noticing.

And Instagram is like a mall for a predator. It's literally just walls of selfies to anonymously scroll through, and then an instant way to DM anyone you want. That's not like a park at all.


Incidentally, in the late 20th century we had a moral panic about exactly this fact and stopped letting kids outside the home unsupervised (in the US anyways)


Having traveled to a third-world country, I can easily buy weed (illegal there) from many Instagram accounts. So I totally believe that Instagram and FB are often used to facilitate illegal activities.

Also, Telegram is another; it's probably even more common on Telegram.


Facebook is a "marketplace" for all sorts of crimes by virtue of having three billion users, which is about 37% of the world's population. The fact that some horrible crimes slip through the cracks, as is inherent to any large population, is no excuse to treat that 37% as criminals.


> This legal inconsistency – the fact that child sexual abuse imagery must be reported, but reporting child sex trafficking is not legally required – is a major problem, says Staca Shehan, vice-president of the analytical services division at NCMEC.

This seems to be a pretty significant legal oversight.


This seems like an obvious first step to combat the problem. Much of the activity is public, too; it's not all in private messages. The article does make a good point that it's much easier to detect CSAM than trafficking, but still, I'm sure Meta and other platforms could be doing so much more to detect this activity and report it.


Facebook has a history of letting crime groups operate even if they are obvious and don't even try to avoid detection: https://krebsonsecurity.com/2019/04/a-year-later-cybercrime-...

Is it really surprising, though? Criminals, including pimps and child rapists, still watch ads, "engage" with the platform and increase the user numbers, and even encourage their peers to also join/engage with the platform. For something like Facebook, surely it's a no-brainer not to miss out on all this "engagement".


This is nonsense.

It is obvious to me that any bad PR is orders of magnitude worse for the business than the potential benefit of having even hundreds of thousands of criminals "engaging" on the platform.


If bad PR were a problem, surely they could start by actually actioning reports on obvious cybercrime pages instead of saying that they "reviewed my reports but that none of the groups were found to have violated its standards".

I stand by my argument. That disgusting company doesn't care about crime and malicious activity because it's cheaper to let it happen and simply clean up the few PR incidents with some lies which people (such as you) will happily believe.


Look, it's clear you don't like Meta. Frankly, neither do I. But the reality is that the article you linked in your original comment contains evidence that supports my point that PR is all that matters.

Namely: "KrebsOnSecurity later found that reporting the abusive Facebook groups to a quarter-million followers on Twitter was the fastest way to get them disabled."

Frankly, it seems you haven't read or understood the information presented in the link you posted.


I probably left too many comments in this thread, but I just remembered something. WTF happened to FOSTA (Fight Online Sex Trafficking Act) and SESTA (Stop Enabling Sex Traffickers Act)? The article doesn't make a single mention of either of these acts. Is Meta somehow exempt from these laws?


Why did a 12yo child have unmonitored access to the internet?


Most 12yo children have unmonitored access to the internet. Many parents have to work and can't monitor their children 24/7. Children are naturally curious, so even if you set up parental controls and such, they'll find ways around them, through friends/classmates and however else. Many other parents are simply lazy/irresponsible and will give their kids a phone/tablet so they can binge-watch Netflix unbothered (most parents I've known fall under this category).


If you think your 12y/o doesn't have unmonitored access to the internet, you're very naive. Especially if you decided to block social media; kids will not have any of that.

I used to be a Boy Scout leader. 10y/o kids routinely broke any and all blocking their parents put on. If they couldn't do it, they bought an old cheap smartphone and went to their friends' houses or McDonald's with it.


It's unusual if they didn't have unmonitored access to the internet. Helicopter parenting takes a lot of energy and technical parents can only do so much. A 12yo without a phone is a social pariah.

What is odd to me in the story is the absence of a moral compass in the child and the lack of knowledge of the internet as a potentially hostile place.


>What is odd to me in the story is the absence of a moral compass in the child and the lack of knowledge of the internet as a potentially hostile place.

The term I see being used again and again is "grooming". I also saw another story that mentioned one person who was convicted, and reportedly contacted THOUSANDS of children online (using many fake profiles, of course). While most children probably aren't being pulled in, it seems like they're "fishing" for the vulnerable few that they can manipulate into being trafficked.

A public education campaign, especially in schools, could do much to reduce the problem, but still there will always be those children who are vulnerable. We should still try to educate children and parents on the dangers, but like climate change, there is no one solution to the problem.


Well said. I wonder if there is a way to identify a particular profile of child who might be vulnerable and direct education toward them. Maybe a bad idea, but for example, DM'ing them using the methods of those convicted.


That's what happens when parents restrict kids instead of teaching them about the dangers.


Yes, just like the “Just Say No” campaign completely stopped children from using drugs and parents telling children to “save themselves until marriage” stopped children from having sex.


That confirms my point. I'm saying that restriction, blocking access, or making something taboo never works; it does the exact opposite (and worse, because it builds a sense of urgency to have/experience the thing).

Instead of forbidding it, you have to teach children about the actual dangers of doing the thing: how to do it safely, how to limit the damage, and what limits they shouldn't cross, along with the reasons for those limits.

Telling them "just say no" or "you have to wait until marriage" only makes them more curious, and in many cases they will absolutely overdo it once they are finally out of the reach of their parents.


I agree with both you and the parent. Early on it's good to have restrictions, then phase them out as they're ready. At some point they need to be prepared to function on their own. Otherwise you might end up with 18 year olds acting like 12 year olds.


Because we know that adults who get addicted to deleterious drugs in adulthood didn't know the dangers, right? Sometimes availability and the wrong situation can land people in a lot of trouble. Finance is another area: despite knowing the severe drawbacks, people in the wrong circumstances may seek out loan sharks.


Not sure why you're bringing up drugs here. We're talking about webpages, not substances directly altering your neurochemistry.

But yeah, indeed, usually it's a failure of parenting. Instead of properly teaching about the dangers and responsibility (in the case of weed/alcohol, or finance), they made it taboo.


Point is, grown adults who "know better" fall into these traps too. It's not only about inculcation or knowing better. If it's there, despite all you know, sometimes you fall for it. That's the point.


That adults know better is a myth. They don't. It's not about knowing better; it's about being taught to be reasonable with the thing, and that takes long exposure to it, ideally with the support of a parent.


You don't get it. People who are trained and educated fall for scams all the time. People in pen testing who know they will get hit also fall. There is a case for a moat or external protection.


Americans believe children are pure innocent creatures who should not be told about sex.

Hilariously churches are hotbeds of pedophilia. And the Republican party advocates child marriage.


I guess I'm missing the joke... what's hilarious about pedophilia?


"A 12yo without a phone is a social pariah."

If I give a kid that young a phone, then I'd probably want it to be a dumb phone, no camera, etc., for emergencies only.


That is a valid question to ask. I have kids around that age and restricted Insta/TikTok/etc. to <30 min per day. Yet from 13yo, most platforms treat kids as adults.


Because we’ve made the internet so accessible and such an integral part of children’s lives? Even just for entertainment purposes, I see kids staring at screens all day everywhere I go. I mean, it’s a silly question, no offence.


It's quite easy to restrict access on kids' devices. We use Google Family Link, and the kids all hate us because they have to ask our permission whenever they want to install anything. We also get to see activity reports, and we talk to them often about bad people online and why they need to be careful.

I understand you can't monitor a child 24/7, but you can be careful with what devices children have access to and what they can and can't do on those devices.

It's also worth remembering that the criticisms of social networks and calls for more monitoring of private communication can be applied to any crime. Do we want Facebook reporting people it suspects could be committing a "hate crime", for example? What about drug users; is that Facebook's responsibility too?

At the end of the day, parents have to accept some responsibility for protecting their own children. Yes, it's hard to do perfectly, but allowing a 12yo to use social media and then meet up with a stranger they met online seems like a total failure on the parents' part here. It's like letting a 5-year-old play outside and then, once the kid gets hit by a car, complaining that cars are the problem because car manufacturers care more about selling transportation devices than protecting kids. If you let your children chat with strangers and meet up with people they don't know, then bad stuff will likely happen no matter what form of communication they're using.


Do your kids have friends? I can guarantee you that they are using their friends' devices.


Which is fine because, as OP explained, they talked to their kids and explained that people online can be nefarious.

In no realm of security do we depend on slapping computers out of people's hands and telling them not to do something. Don't do it with your kids' security either.


I look at that kind of like: do you eat junk/fast food all the time at home, or just occasionally when you go out? It's going to do a lot more harm if it's constant vs. a few cheat meals here or there. Same concept for limiting device time and privileges.

Also, it can help to know what restrictions the friends have and talk to their parents. At least in my day this was done to make sure there weren't any obvious physical dangers. Or has getting to know the parents of your kid's friends gone out of fashion?


Can you really talk to all of the parents of all of the kids your child goes to school with and spends time with?


Ostensibly, locations like school and other organized activities have some level of direct supervision. Private situations, like being at another's home, are more where this comes into play. That's more likely to be where a kid has access and opportunities for things like alcohol, drugs, weapons, misuse of the internet, etc.


When was the last time you were in school? Drug use is endemic inside schools. It’s swept under the rug better in suburbia.

With everyone having a phone, internet access can happen everywhere.


"Drug use is endemic inside schools."

Some schools, maybe. It wasn't that long ago that I went to one. I also know teachers, resource officers, etc. who currently work there. We weren't allowed to use phones in class; if kids were caught, they would be disciplined. We also had periodic random drug screenings and search dogs.


This is really hard to read. It fits in with what we know about Meta’s content moderation in other instances, though (see the genocide in Myanmar). If you are an employee of Meta, or even just a user of their products, I would ask you to take a step back and think about whether, in totality, this company is worth working for.

- doesn’t care that one group of humans is using their platform to plan to kill another group (Myanmar)
- doesn’t care about child sex trafficking
- doesn’t care that teen girls are committing suicide

Even if you say it isn’t Meta’s fault, don’t they bear more responsibility than they are taking?


Yes. Imagine creating a platform that facilitates the kinds of things that article recounts and, instead of dedicating all possible resources to fixing the problem, being like: nah, let's just pour millions upon millions of dollars into making "the metaverse".

Truly a moral void.


They didn't know what they built. And there are no real fixes.

It's your basic Jurassic Park story.

But the only difference is that things scaled so fast they got the whole world inside the park before people started noticing how dangerous the park is.

All the fixes will be half-baked because the park itself is poorly thought out.


At an even higher level, Apple and Google devices are what allow hateful and abusive content to be created. Should they bear some responsibility and prevent us from creating such content?


I think this is a fair question. The way I see it, the difference is that a Google/Apple phone is more like a tool (say, a hammer), whereas Meta is providing a public space (like a restaurant or public square). In real life we would say that a restaurant owner has some duty to prevent their restaurant from being used to kill people, encourage suicide, or conduct sex trafficking, but we wouldn't ask a hammer manufacturer to police how people use hammers (we do still require some basic things, like that the product must be reasonably safe). So the root difference is the difference between a tool and a platform. Both bear some responsibility, but we expect the platform/space to bear much more responsibility for what happens.


How is Facebook not a platform? Should Apple police messages sent over iMessage?


What bearing does your question have on what Meta's responsibility is, exactly?


It is imperative to enforce the law and block internet platforms that fail to comply with legal regulations. The internet cannot serve as a sanctuary for promoting neo-Nazi groups and other illegal activities, as it must remain subject to legal jurisdiction. All individuals and organizations, whether online or offline, must be held accountable to the law. It is unacceptable to allow hate speech, homophobia, and the promotion of heinous crimes, such as child murder, to proliferate unchecked. The platform Telegram, for example, was rightfully blocked for refusing to provide authorities with phone numbers. It is essential that this platform and others that violate legal standards be severely punished to ensure compliance with the law.


Agreed!

While we're at it, we should also put cameras and microphones in every street corner, every public place, every household, every toilet. Everyone who refuses to do so should be considered a terrorist for refusing to cooperate with the authorities who, we all know, are infallible, only have children's interest at heart, and would never abuse their power for their own gain.

We should not stop until every inch of the Earth is covered in surveillance technology - after all, nothing to hide, nothing to fear. That's the only way we can guarantee the safety of our children. And no price is too high for the safety of our children! Anyone who says otherwise is a terrorist and a child abuser, and if that's not obvious to you, you're a terrorist and a child abuser.

Privacy is a necessary precondition to child abuse - therefore, allowing privacy is allowing child abuse.


Agreed! The government should have access to all online accounts - bank, email, Facebook... and at least once a day each citizen should give a mandatory report of what they were thinking.


> and at least once a day each citizen should give a mandatory report of what they were thinking

So, just like Scrum?


Using standups to determine who’s an upstanding citizen…


Is there nothing in between? While surveillance is indeed bad, a free pass is also bad. So what should we do about the crimes already happening, as few as they might be?


Yes, there is a lot in between, that's the whole point of the hyperbole.

Internet surveillance could solve child abuse, but so could nuclear bombs. Solutions to problems should be judged by the amount of collateral damage they cause.

The original comment advocates for government surveillance of all digital communication - in my opinion, that causes a lot of collateral damage.


Did the last five times we did something help?

How many "save the kids" bills have we passed that haven't worked?

Repeal those and let's talk.


We are almost there: https://www.youtube.com/watch?v=JMLsHI8aV0g

nothing to hide, nothing to fear


While I knew China is a surveillance state with cameras and face recognition everywhere, this is just horrid. Fuck, what a terrible place to live; I can't even imagine growing up there as a child.


Need more AI and surveillance in our lives for sure.


Absolutely! And neo-Nazis are just a start; the bans on all illegal activities in all nations should be upheld as moral imperatives. The law should be treated as inviolable religious edict, and those who violate it as blasphemers and infidels!


Excellent troll post (assuming it is one).



