Reddit used to be great until they pulled their bait and switch, then added all kinds of authoritarian features.
A brief history of reddit:
>We want to democratize the traditional model by giving editorial control to the people who use the site, not those who run it.
— Reddit FAQ 2005
>We've always benefited from a policy of not censoring content
— u/kn0thing 2008
>"A bastion of free speech on the World Wide Web? I bet they would like it," he replies. [reddit]'s the digital form of political pamphlets.
— u/kn0thing 2012
>We will tirelessly defend the right to freely share information on reddit in any way we can, even if it is offensive or discusses something that may be illegal.
— u/reddit 2012
>We stand for free speech. This means we are not going to ban distasteful subreddits. We will not ban legal content even if we find it odious or if we personally condemn it. Not because that's the law in the United States - because as many people have pointed out, privately-owned forums are under no obligation to uphold it - but because we believe in that ideal independently, and that's what we want to promote on our platform. We are clarifying that now because in the past it wasn't clear, and (to be honest) in the past we were not completely independent and there were other pressures acting on reddit. Now it's just reddit, and we serve the community, we serve the ideals of free speech, and we hope to ultimately be a universal platform for human discourse (cat pictures are a form of discourse).
— u/yishan 2012
>Neither Alexis [u/kn0thing] nor I created Reddit to be a bastion of free speech
— u/spez 2015
It feels like reddit changed in line with the rest of the Internet though.
In 2022 you have the choice of a) "suppressing free speech" or b) "platforming racists" and there is no middle ground and no way to win. You can't pass the buck to moderators.
> In 2022 you have the choice of a) "suppressing free speech" or b) "platforming racists"
That's a great example of the false-choice fallacy. "Free speech" = "racism" is the implication of your choice -- one which authoritarians would indeed embrace. The Soviets considered free speech dangerous, too; and so does the CCP.
Um, the upstream post presented a false dichotomy of either supporting free speech (as in reddit allowing people to say whatever) or racism. The parent comment to yours cited the ACLU's defence of public speech as a precedent for the importance of letting people say what they want even if you don't agree with it.
Nobody argues that reddit is constitutionally bound to host free speech. The point is that what's right in a public forum is also right for a company. You appear to be focusing on an irrelevant technicality or parroting the "they can build their own Twitter" defence, but neither of those is relevant here. The only point being made is that free speech (independent of constitutional obligation) is good and should be broadly supported by platforms.
> The point is that what's right in a public forum is also right for a company.
I don’t think we’ve established that, and I’m fairly confident that the ACLU doesn’t think so either for that matter.
If corporate or individual speech had to embrace all viewpoints without restriction, it would cease to be independent at all, wouldn't it? In other words, protecting the right of individuals to express the viewpoints they choose is in fact the more democratic ideal, isn't it?
Separately, please refrain from speculation about what I or others _appear_ to be saying. If I intend to say something I have no problem doing so.
The ACLU are full-throttle political partisans today, and don't give a fuck about free speech except in the same sense Stalin did: what they like should be allowed and what they don't like should be banned, and they act accordingly.
Self-moderation was and still is a way out. What if there were better tools to allow users to say "I don't like this person, I don't want to hear more from them," "Bill likes this person, maybe I'll hear him out," or "I don't want to deal with this kind of thing in this room, can you talk about it elsewhere?" The tools we have for those things are not nearly as sophisticated as what exists in real life with body language, facial expressions, physical distance, symbols, demeanor, conversation timing, etc. In real life, our tools to self-moderate and separate into amicable groups that still permit crossover are much better.
The major problem with most internet platforms is that boundaries and user controls are nonexistent. Everything is open for everyone to see, and no one can control who is in their network. Only a handful of moderators and platforms have control over both visibility and membership/rules, and those controls are limited and primitive.
Feeds which magically select content users are supposed to like, and the attempt to eliminate user settings to make platforms more accessible and magical, contribute to this problem. Creating effective and easy-to-use self-moderation tools is a difficult enough problem on its own; when you also have to fight a lot of entrenched design philosophy, it's even harder.
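The kinds of user-directed controls described above (per-user mutes, friend endorsements, per-room topic rules) can be sketched in a few lines. This is a purely hypothetical illustration, not any real platform's API; all names here (`Post`, `UserPrefs`, `visible`) are invented for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    room: str
    topic: str
    text: str

@dataclass
class UserPrefs:
    # "I don't like this person, I don't want to hear more from them"
    muted: set = field(default_factory=set)
    # Friends whose endorsements can override a mute
    trusted_friends: set = field(default_factory=set)
    # "I don't want to deal with this kind of thing in this room"
    banned_topics_by_room: dict = field(default_factory=dict)

def visible(post: Post, prefs: UserPrefs, endorsements: dict) -> bool:
    """Decide visibility from the viewer's own rules, not a global mod's.

    `endorsements` maps an author to the set of users who vouch for them.
    """
    if post.author in prefs.muted:
        # "Bill likes this person, maybe I'll hear him out": a trusted
        # friend's endorsement overrides the mute.
        if not (endorsements.get(post.author, set()) & prefs.trusted_friends):
            return False
    if post.topic in prefs.banned_topics_by_room.get(post.room, set()):
        return False
    return True
```

The key design point is that every rule lives in the viewer's own preferences, so two users can see entirely different views of the same room without anyone being globally banned.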
> If there were better tools to allow users to say "I don't like this person, I don't want to hear more from then"
That's not a solution in cases where those people are using the platform to gather a group of angry people who will then go and kill you and your family and those who look like you.
Now, I'm guessing this isn't a problem in _your_ case -- but it is, for some others (see e.g. FB and Myanmar, or WhatsApp lynch mobs).
Another problem is when the ones you don't want to hear are spreading propaganda supporting a would-be dictator. And then eventually they succeed, because those who didn't like the dictator just ignored them (let them spread the propaganda unhindered). -- What's the likelihood that the US is still a democracy in 30 years? (Btw I didn't downvote your comment)
> "Another problem is when the ones you don't want to hear are spreading propaganda supporting a would-be dictator. And then eventually they succeed, because those who didn't like the dictator just ignored them (let them spread the propaganda unhindered). -- What's the likelihood that the US is still a democracy in 30 years?"
Is a two-party system backed by billions, with elaborate social media timelines (from experts in sociology, psychology, K Street, and NGOs), even close to a democracy at this point in time?
Regardless of which hand you consider the correct one, we have deeply flawed versions of the world being marketed and sold at unprecedented scale and scope.
Imagine if you worked for a company where every 4 years workers voted to keep or replace management. What if the company's main customer had a controlling vote?
I'm surprised that other comment wasn't received well. I'll try to articulate myself better and see if that changes anything.
I'm not saying self moderation would eliminate bad groups. I think it would better contain them. I think having more siloed, fragmented, and naturally evolving online ecosystems would prevent any of the problems you mention from spilling over and infecting the entire culture.
I don't think any of those risks you mentioned are solved by centralized moderation. I think they're made worse, as everyone becomes incentivized to fight for control of moderation and is pulled towards ousting the other to recreate the groups those in power naturally feel most comfortable with. I firmly believe the social dilemma people are currently in is a result of the entire world trying to make rules for one big room, and no one is happy because people come from vastly different contexts that could never be properly accounted for in one room and one set of rules.
Allowing people to make their own decisions about association and information is far more scalable, democratic, and corruption-resistant than trying to make platforms that appeal to all and respect all people through top-down moderation. It allows people to let off steam privately and meet in designated rooms in the middle. That may seem terrifying given some of the people and groups out there. But I think recent history is a pretty good indicator that trying to forcibly educate and moderate people who are deemed a problem backfires severely. Malicious and violent groups can and should be contained and watched by others, but trying to prevent association and people deciding what's good and what's bad for themselves drives more people to opposition and affiliation with potentially violent groups than they would otherwise associate with. If people are able to express certain ideas without being automatically ousted from certain groups and affiliated with extremists, that lessens the pull to extremism.
There is no perfect solution, and I do not think that every forum should avoid top down moderation or that certain associations shouldn't be watched or potentially dealt with in the real world if dangerous enough. I just think more user directed moderation is in general the least worst option.
What's worse: a group, made up of those who voluntarily choose it over others, spreading messaging that encourages mobs to go out and kill people, in a world where other groups can counter-message, organize, and defend themselves; or a group that controls global messaging, encourages mobs to go out and kill people, and bans all counter-messaging?
What's worse: propaganda supporting a would-be dictator spreading in a forum of voluntarily associated zealots which bans and ignores criticism but has no power over alternative group organization, or propaganda supporting a would-be dictator spreading on a global platform that bans and ignores criticism of all other groups?
And? Are racists really worse than other types of bigots (misogynists, intellectuals, anti-intellectuals, eugenicists, radical anarchists, religious zealots, anti-religious zealots, republicans, democrats, etc.)?
There are definitely approved forms of bigotry that are tolerated on just about every social media platform. I've never seen a post promoting death to the unvaccinated get moderated, for example. I really hate the idea that because a site has racist users it would be considered a racist platform. That's throwing the baby out with the bathwater.
On sites like reddit, I can ignore hateful subreddits just as easily as other topics I’m not interested in. I don’t think it’s reasonable that there is an expectation that every platform police everything that might offend someone and I don’t know of any platform that attempts to. Certain bigotry is just less tolerated by admins than others.
r/conspiracy is an interesting example. Opposing views on just about any topic are accepted and there’s genuinely a lot of critical thinking and research that goes into many of the comments. Sure it’s filled with crap posts, loonies, and paranoids, but constructive, respectful discussion is had about topics considered too taboo for most other subreddits. And they are maligned by most of Reddit to the point where certain subs will ban you just for subscribing or posting to it.
Disclaimer: of course racism is awful and stupid. I'm not advocating promoting racists or misogynists or any other hate group. I just think the benefits of open, free discussion outweigh the drawbacks of hateful, stupid discussion.
They disrupted reddit to the core and got the CEO fired. The hijacking of "freedom of speech" in order to sustain a fascist movement was about the most bullshit thing ever, and it nearly ruined reddit for good.
I'm not sure to what you're referring. I can't find reference to any reddit CEO being fired. Ellen Pao resigned, but the catalyst for her resignation wasn't related to fascism as best I can tell, it was related to firing the person most responsible for AMAs without any suitable replacement.
Just because you think the choice needs to be framed that way doesn't make it true. This reminds me of the political mailers that the Canadian conservatives used to send out: it was something like "do you support our reform of law X or do you support keeping child molesters on the street". Its a pretty low form of rhetoric.
That's what happens when you ban witches from everywhere. Anytime you get a platform that's not an insufferably clamped-down moderation hellhole, the cranks concentrate there and the site becomes too cranky to be useful, even though cranks in small numbers could be valuable.
Sadly, what passes for moderation at most places nowadays is even worse. It's hideous and stifling. If the cranks make places too annoying to stay at, woke moderation makes places feel pointless to participate in.
Reddit also stopped showing users the red background on their mod-removed comments. So when you're logged in it looks like your comment is live [1]. You can try it here [2].
I once tried looking through their archived code base to find when they made that change but couldn't decipher it. It's possible they never showed authors the red background at all.
Yes they do! I read this website for months before posting, and then all my posts would always get deleted, or deleted secretly. So I realized I've just been reading propaganda this whole time. I was gutted. It feels like I've been lied to the entire time. You just naturally assume you're reading what people actually think. I wish there was a way to go back to the old internet, where there weren't massive amounts of money involved in policing people's conversations.
Reddit stopped using shadowbans to ban actual users a while ago, it's only for spammers now. Of course, it still happens now and then accidentally, but it's not meant to be a punishment for real users that break sitewide rules.
I think you mean the automod setting that autodeletes your comments if your account is too new? That's something entirely different and not a sitewide ban like a shadowban would be. That's up to the moderators of the subreddit.
I'm not sure that's justification for continuing the policy. Most people are surprised to discover mods can "remove your comment and it still shows up on your side as if it wasn't removed". [1]
Reddit gave mods the ability to shadow ban about 5 years ago via automod [2]. Now even crowd control can remove comments [3]. These features would be fine, in my opinion, if users could discover when their comments were removed.
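A minimal sketch of how a user could check for this kind of silent removal, assuming (as described above) that the public copy of a removed comment shows only a "[removed]" placeholder while the author's logged-in view still shows the original text. The function name and sample strings here are invented for illustration; in practice you would compare your logged-in view against the comment as seen from a logged-out session.

```python
def is_silently_removed(author_view_body: str, public_body: str) -> bool:
    """True when the author still sees their original text but the
    public copy of the same comment shows only a removal placeholder."""
    placeholders = ("[removed]", "[deleted]")
    return public_body in placeholders and author_view_body not in placeholders

# Illustrative comparison of the two views of the same comment:
author_sees = "I disagree with the mods on this one."
public_sees = "[removed]"
print(is_silently_removed(author_sees, public_sees))  # True
```

The point of the comparison is exactly the discoverability gap the comment above describes: nothing in the author's own view signals the removal, so only checking the public copy reveals it.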
Reddit should break off all political content into its own site, porn into another, and keep the rest in place. If a post is porn or political, mods can click a button to move it to the dedicated site. This would be similar to the “move topic” functionality from phpBB and vBulletin.
This might solve a lot of problems, but it would create issues on the edge, like r/pics content that may not be overtly political but clearly has an agenda.
Personally I use Reddit exclusively for political flamewars on one account, and in-depth technical content on another account.
I'm starting to feel pretty unwelcome, though, when it comes to anything "misinformation" related. Once you get banned from one subreddit, if you have a post reported on the same subreddit from another account, then both your accounts get banned site-wide. Combine this with the dragnet, unsolicited bans that some mods are sending to anyone who posts in certain subreddits and you have a real chilling effect.
The average IQ on Reddit is also about 110, which is just boring in most cases. When you can predict the top comments on a thread, reading them is probably a waste of time.
You're ignoring the way that subreddits like r/ChoosingBeggars are full of trivially faked texts that tick off every box to rile people up on political and social issues.
In the current polarized era, all discussion online is inherently political. The only way to escape it is in small groups with other people you have mutually agreed not to talk about it with.
This is the result of Citizens United; when elections are decided by who has the most money, it’s a zero-sum game until politics have consumed the totality of our lives.
Is HN "small"? Political discussions at HN (which are quite rare) seem obviously low-quality compared to the rest of the site, but still rank well above much of the Internet. I assume that the same would go for many niche-focused spaces, where curiosity and intellectual interest can dominate.
I actually don’t feel the discussion on any topic on HN is any higher quality than other sites with a similar age demographic. And yeah, HN is kinda small by internet standards.
Circa 2005 reddit was a forum for nerdy young tech people, talking about miscellaneous web links but mostly focused around nerdy tech stuff. Nobody I know who read reddit circa 2005 wanted it to become a white supremacist hate forum. When it started to attract large numbers of white supremacists who deposited their steaming piles of bullshit (hate speech, vicious personal attacks) all over the site, lots of people reading the site (including those who had been there since 2005) were appalled. Obviously advertisers also don’t want to associate their brands with hate speech (or child porn, etc.).
I joined Reddit around 2007, and my perspective is completely the opposite of yours. Reddit was a place where you could create your own community and moderate it as you see fit (as long as it's legal). It was a place where people respected free speech: you don't have to agree with someone, but you'll defend their right to say it.
White supremacy or hate wasn't really a thing on reddit until the past few years. The most controversial content on reddit was r/jailbait (whose removal I'd say a majority of the community agreed with, although many did begrudge the shifting of the Overton window away from 'we host anything that is legal'), but then "social justice" became a thing and a new crowd of redditors seemed to want to de-platform people.
I don't have a time machine, but I suspect if you polled redditors in 2017 on whether they'd support the de-platforming that is happening today, a majority would vehemently disagree.
But my impression is that being the central hub of one of the most aggressively hateful online pro-Donald-Trump communities in 2016 attracted a larger number of abusive users and increased the visibility of the problem.
The beauty of reddit, in theory, is that you don't have to subscribe to subreddits you don't like. This worked really well, and still does, to a large extent. Of course, if you have political content on /r/videos, /r/pics etc as is the case today, you're gonna have a bad time. I wish there was more aggressive modding in non-political spaces, and less modding in the places that are.
AFAIK it’s still pro-atheism. And Ron Paul was basically the only Republican not in bed with the Halliburtons of the world, which was considered progressive in 2008.
I think a more fair characterization of reddit at that time was firmly anti-theist. I'm not religious, but I definitely remember those days from when I was still the primary demographic of Reddit (early-20s male in college, hyper-jacked into the news, IT worker and programmer, science-focused, etc.).
Somewhere along the way, Reddit (and HN?) drifted SO left-wing that it started non-ironically referring to “libertarian” and “far-right” as interchangeable.
It never was, though? Not the site. Some subs might've been, but the site overall never was. The most I've seen is having to leave some subs I used to frequent because the mods became crazy, and their madness definitely didn't lean right. As for odious subs like that, the most I've seen is people complaining about their existence, but they didn't contaminate the rest of reddit.
The place used to be far more liberal both in the sense of openmindedness and in the sense of people having the ability to tolerate people they didn't actively like. That atmosphere is mostly gone.
That frame of mind assumes there are only two camps, the Racists and the Non-Racists… when there's a third: parties that want to sow discord for their own interests, be it civil discord, ad revenue, or 'user engagement'.
You could NOT escape the stuff towards the end of the Trump administration. It wasn't only Reddit.
It attracted a large enough number of toxic users that many parts of the site (those not aggressively moderated) were littered with personal attacks and hate speech, which was unpleasant for everyone else.
For a platform built on user contributions, they got too big (everybody ends up saying the same thing and getting downvoted as a result) and the moderation got hostile to the point you don't WANT to contribute. Eventually you find that the site that used to be fun, with lots of interesting things from interesting people, has turned into a chore.