Zuck has been around long enough to have experienced goatse, rotten.com and all the other shock sites; back then, I'd argue, it was easier to find and be 'accidentally' exposed to than today.
But, that's the odd experience. Having to deal with new levels of depravity every day is different.
Anyway, what do you believe would change if Zuck did it for a day? They need moderators, as long as the technology for detecting it automatically isn't good enough yet. He knows it's a problem.
>Anyway, what do you believe would change if Zuck did it for a day?...He knows it's a problem.
Well, for starters he might stop outsourcing the work through a contractor that pays its employees 1/8 of what the average Facebook employee is paid. Maybe the powers that be would accept and acknowledge that these underpaid contractors are developing PTSD-like symptoms, are clearly not getting services internally, and lack the financial compensation to get such services externally...not to mention the contractual arrangement, which seems to keep the contractors from seeking help externally.
I don’t think the point OP is making about Zuckerberg spending a day doing the work of one of these positions is that it would determine the fair market wage of this position in Arizona...
After all, what is the local market rate for a job that has shown a tendency to trigger PTSD without sufficient benefits to treat said PTSD? You would hope at least enough to cover PTSD treatment... Would you continue to use Facebook, or allow your kids to use Facebook, if there was a good chance of them developing PTSD symptoms? Would you take a job where there was a good chance you would develop PTSD and the job wouldn’t cover it and didn’t pay enough for you to cover it yourself?
Yea, but seeing someone spread his asshole in a self-pleasuring manner isn't really the same as the daily parade of conspiracies and snuff videos that content moderators have to deal with.
That they would gain knowledge of what the job is like, and so realise that their employees doing the work need a lot more support than they currently have.
If it were Facebook employees that might even be the case. As the article states it's conveniently outsourced so they can just label it as a dollar expense without a guilty conscience.
And the secret is that there is no current solution.
There is no automation to stop the sheer amount of dumb, sick, sad, horrifying, malignant and monstrous stuff being put up every minute from around the world.
Frankly it could probably break social media the same way cancer could kill the tobacco industry.
And this is simple moderation - ignoring newsworthiness, censorship, speech, propaganda and other issues this throws up.
The best they can probably hope for is that the users forget about it.
I mean seriously - what are publicly owned firms supposed to do?
Open up the graph of people who post such content. The more offensive content you post, the less privacy on your Facebook account.
Verification checkmarks allow one to screen out all content posted by unverified users; verification is available in return for a street address, to which FB mails a letter as a low-bandwidth 2FA. That's not so hard to fake, but also not hard to detect as fake when people decide whether or not to filter out your content using shared blocklists.
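The filtering scheme being proposed could be sketched roughly like this (a hypothetical illustration only; all names and structures here are invented, not any real Facebook API): posts from unverified users, or from authors on any shared blocklist the viewer subscribes to, are screened out of the feed.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    verified: bool  # author completed the mailed-letter 2FA check
    body: str

def filter_feed(posts, blocklists, require_verified=True):
    """Keep only posts the viewer has opted in to seeing."""
    # Union all shared blocklists the viewer subscribes to.
    blocked = set().union(*blocklists) if blocklists else set()
    visible = []
    for post in posts:
        if require_verified and not post.verified:
            continue  # screen out all content from unverified users
        if post.author in blocked:
            continue  # author appears on a subscribed blocklist
        visible.append(post)
    return visible

feed = [
    Post("alice", True, "hello"),
    Post("bot123", False, "spam"),
    Post("mallory", True, "troll"),
]
shared_blocklists = [{"mallory"}, {"bot123"}]
print([p.author for p in filter_feed(feed, shared_blocklists)])  # ['alice']
```

The point of the design is that moderation load shifts from a central team to viewer-chosen filters: nothing is taken down globally, but unverified or blocklisted content never reaches subscribers of those lists.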
>I would be impressed if Zuck and Sheryl decided to spend a day per year doing this job. I suspect that as a billionaire and the CXO of one of the largest companies of the world, you can get disconnected from the details.
Consider that Mark Zuckerberg enjoys personally using a stun gun on a goat and then slicing its throat[1], and spent a year similarly personally killing any animals he ate. After his day of undercover work he may enjoy the job and declare the complaints baseless.
I wouldn't be surprised. Zuck slaughtered his own meat for a year; he seems willing to expose himself to a harsh reality to remind himself that it exists.
I suspect that as a billionaire and the CXO of one of the largest companies of the world, you can get disconnected from the details.
It would be valuable for them to see a different reality of their platform.