Hacker News

I think instead of deplatforming them, the idea is to force them off onto their own platform.

When racists (or similar "undesirables") share a larger platform, they've kind of got to be on their best behaviour: they can't be too obvious, so they've got to be subtler, and in doing so they likely attract more people to their cause, because susceptible people tiptoe in their direction gradually.

If they're all forced out onto their own platforms because of "muh free speech", the users won't tolerate being policed, so the absolute worst comes out, and anyone stumbling across the community is going to be repelled fairly quickly.




That assumes they all only go to one place, which then turns into a cesspool that immediately repels anyone who enters.

There are two problems with this. The first is that it works both ways. If you expel all the heretics, then your site becomes a cesspool of groupthink that immediately repels anyone who has had so much as a conversation with someone from the other side. You'll be saying things that are transparently wrong to anyone who has had any real-world contact with the subject matter, and you'll be unable to correct yourself, because anyone who spots the error fears being ejected for pointing it out.

And the second is that the version of the opposition on their best behavior for doing recruitment doesn't actually disappear, it just ends up in separate places. Your opponents will have their home base which is full of their own obviously wrong groupthink, but there will also continue to exist places where moderates gather.

Which means you have a new problem. The place where moderates gather will still have the subtle extremism you were trying to eject, but now, because your population hasn't been exposed to it, they're more susceptible to it. They're unvaccinated. So now you have to eject not only the extremists but also the moderates, because anyone who starts listening to the moderates may start to realize that some of the things your own extremists say aren't exactly true. Which puts them at high risk of switching to the other team. And you've already turned the other team into a coalition of crazy extremists. But ejecting the moderates turns your own team into a coalition of crazy extremists, which is likewise quite ungood.


People like their own online bubbles in much the way they like their own offline bubbles. If you're sick of hearing the shit some people spew in the real world you don't go out of your way to be in the same places as them if you can avoid it.

We choose our social circles and the material we consume in real life; it's not surprising this happens online as well. Broad platforms like Twitter, Reddit, etc. aren't any more immune to this than offline life.


So let me give you an example of the problem. You have your filter bubbles, but then you visit an independent forum for amateur taxidermy. It has an irrelevant miscellaneous section which is well moderated enough to not be full of spam, but not by someone who really cares about or even particularly understands politics at all.

Someone on that forum posts the following statement. "The concept of white privilege is anti-Semitic because the subset of white people who are doing better than black people are disproportionally Jewish."

The factual component is true, it's not obviously spam, so the moderators leave it there. But what happens when people on the left read that?

It pits members of the same coalition against each other. If you're black you start wondering whether Jewish privilege is a term you should start employing, but you're not likely to be pleased with the response if you do, and that may leave a bad taste in your mouth. If you're Jewish you feel attacked and suddenly nervous about a popular tenet of your party's platform. If you're a non-Jewish white Democrat who has never been exposed to anything like this before, you're primed to receive some outright Nazi propaganda next.

Statements like that need to be encountered for the first time in an environment where the problems with them can be analyzed thoughtfully and without vitriol or recriminations, because otherwise, when they are encountered, they create internal conflicts and push people into the arms of the opposition.

If you ban them from your filter bubble, that is not the context in which they'll be first encountered.


This is self-correcting. The mod will see the inevitable shitstorm and ban anything that looks remotely like it in the future. The filter bubble of amateur taxidermists will filter it out.

"Statements like that need to be encountered for the first time in an environment where the problems with them can be analyzed thoughtfully and without vitriol or recriminations, because otherwise, when they are encountered, they create internal conflicts and push people into the arms of the opposition."

This will never happen on social media, except perhaps in some highly moderated forum set up for the purpose, which is its own filter bubble and in which the expertise of the participants can be ascertained.

Any forum like that essentially excludes most of the general public.


You're assuming the shit storm happens on the taxidermy forum. But most of the taxidermists aren't there to talk about that stuff, or maybe some of them are but they're not the sort to be uncivil or try to cancel the heretics, so it isn't a problem there.

But then those people bring the heresy into the rest of their lives and get thumped by the mob for crimethink when they bring it up. And then once they're declared an enemy by their own tribe they seek refuge in the opposition.




