
Think about it for a second.

Where else can an employee browse filth all day in a secured environment? Free from the eyes of children who may be doing pandemic schooling from home? And indemnified against criminal liability for opening certain troubling content by an authoritative, audit-ready, third-party-recorded video feed?

There simply are legal requirements around certain types of porn, for example, or around videos of illegal activities, up to and sometimes even including murder.

Law enforcement authorities only tolerate a certain number of mistakes in this regard. With certain content, they may not tolerate any mistakes. I'm not saying FB is right. In fact, FB probably didn't even make the decision; they outsource. What I am saying is that FB, with the content they want their contractors to moderate, is putting those third parties in highly precarious legal positions. I don't blame those companies for acting to protect themselves if FB is not willing to take on all legal responsibility for things that may go wrong.




>There simply are legal requirements around certain types of porn, for example, or around videos of illegal activities, up to and sometimes even including murder.

If I understand your point, it's that law enforcement doesn't allow moderation of these items outside of a certain environment? If so, can you suggest further reading on the laws in this area? I've never heard of them.


My point is not that law enforcement won't allow moderation outside certain environments. It's that law enforcement specifically forbids possession of, or even access to, these materials by anyone. FB likely has some kind of working agreement with law enforcement which allows them to do the business of moderating and reporting. Does that agreement extend to allowing FB contractors to access child porn from home? I'm in no position to say since I'm not privy to the details of the arrangement.

I'm saying as a contractor, as the low man on the totem pole, there is no way I would want any of that content touching my home network without an iron-clad assurance from every level of law enforcement that I would not be prosecuted. An assurance from your local prosecutor probably means nothing to the guys at the US Attorney's office. (And sometimes even vice-versa.) Accessing those materials in a secure environment that is audited and recorded, in direct partnership with every level of law enforcement, avoids issues of that content touching your router altogether.

I can't imagine that any person who understands criminal liability would actually want to do this kind of work from their own home networks.


I remember some Microsoft people who sued due to PTSD:

https://thenextweb.com/microsoft/2017/01/12/microsoft-sued-b...

I'm not sure exactly how this works in tech. I know lawyers who have worked on child abuse cases, and they had tight restrictions on the evidence. Only the trial lawyers could request access: no paralegals, no secretaries, no copies. Only select lawyers, the jury, and the judge could view the evidence, and it was all controlled by the Justice Department. (I knew a lawyer who refused to look at the evidence, only the descriptions, because he didn't think he could handle defending his client otherwise. He did successfully defend him, though.)

I suspect that for Microsoft, some preliminary hashing algorithms automatically send known media to particular people, with the same Justice Dept exceptions and controls for chain of custody, as in the sketch below. Other people may see content of course, but it would likely get flagged and shipped to people who are authorized to deal with it. You don't want a lot of people on that list, obviously, but when it's just a few people, they get a constant stream of nightmare fuel.
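
For intuition, here's a hypothetical sketch of that kind of routing. The hash set, queue names, and use of SHA-256 are my own illustrative assumptions, not Microsoft's system; a real pipeline would use a perceptual hash (see PhotoDNA below), not an exact digest:

    import hashlib

    # Fingerprints of already-identified media, e.g. from a
    # law-enforcement-provided hash list. (Illustrative placeholder.)
    KNOWN_HASHES = {"9f2b..."}

    def route(media: bytes) -> str:
        digest = hashlib.sha256(media).hexdigest()
        if digest in KNOWN_HASHES:
            # Known material bypasses the general queue and is visible
            # only to the few reviewers cleared under the legal controls.
            return "restricted-review-queue"
        return "general-moderation-queue"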


This is a system called PhotoDNA, which was developed by MS and is now used by all of the major US tech companies.


I imagine they don't even want the fingerprints "getting out". Imagine if you had the fingerprints and the algorithm. This can't possibly be a cryptographic hash (too fragile); it's got to be possible to find "collisions" that have no relationship to the original image, and probably don't even look like images of anything at all.
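
For a feel of why such fingerprints are fuzzy, here's a minimal sketch of a simple perceptual "average hash" in Python. This is emphatically not PhotoDNA, whose algorithm isn't public. Nearby images produce nearby hashes, compared by Hamming distance, which is exactly why collisions are far easier to construct than for a cryptographic hash:

    from PIL import Image  # pip install Pillow

    def average_hash(path, size=8):
        # Downscale to an 8x8 grayscale thumbnail, then set each bit to
        # whether that pixel is brighter than the mean: a 64-bit fingerprint.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming(a, b):
        return bin(a ^ b).count("1")

    # Two images "match" if their hashes differ in only a few bits:
    # if hamming(average_hash("a.jpg"), average_hash("b.jpg")) <= 5: ...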

Imagine the chaos someone could cause.

It'd be like back in 2010ish when somebody encoded a bunch of malware into bitcoin transactions: every bitcoin node running any kind of antivirus software was immediately kicked offline. Many AV programs deleted their owners' copy of bitcoind. It was madness. It forced the bitcoin developers to store the blockchain on disk XORed with a different random number unique to each computer.
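
The fix is conceptually tiny. Here's a hedged sketch of the idea (not Bitcoin Core's actual code): XOR the data with a per-machine random key before writing, so the byte patterns an antivirus scanner signature-matches on never appear on disk, and XOR again on read to restore them:

    import os

    def load_or_create_key(path="xor.key", length=8):
        # A random key generated once per machine, so no two computers
        # store the same byte patterns. (Illustrative, not bitcoind's.)
        if os.path.exists(path):
            with open(path, "rb") as f:
                return f.read()
        key = os.urandom(length)
        with open(path, "wb") as f:
            f.write(key)
        return key

    def xor_bytes(data, key):
        # XOR is its own inverse: the same call obfuscates and restores.
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))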


I think it's even simpler than that. Imagine the scandal if some content moderator working from home copied over and saved child porn he found during moderation, and perhaps even redistributed it. It doesn't even have to be child porn; it could just be, e.g., sexting messages between adults.


Just imagine the fallout for Facebook if a contractor slips up and opens up a laptop on the train or at a coffee shop. It doesn’t have to be malicious, just negligent. And as recent scandals at Amazon and Twitter have shown, employees are as vulnerable to bribery and social engineering as any other human. Even if Facebook is legally covered, the PR risk is too great.


If this were a legal requirement, the actual Facebook employees would have been brought back.

They haven't been, just the people contracted by CPL/Accenture to work at FB, which leads me to believe that this is not a legal requirement.

Presumably FB told those companies that they need to ensure security, and the companies decided that the easiest way was to bring everyone back to the office.



