I think the author doesn't understand that reddit mods don't work for reddit. In their discussion with the mods they mention "site-wide rules", which makes me think they don't understand that the mods of each subreddit make their own rules.
On that sub, their rules don't allow people who mostly post links to their own site. It's up to you whether you agree with their rule, but those are the rules they have chosen.
The common response is "if you don't like the rules make your own subreddit". Unfortunately that doesn't scale very well, especially if you're trying to replace a popular subreddit.
Of course the simple workaround here is that OP shouldn't be posting their own content anyway. They should send it to a friend who can post it if they think it's interesting.
Posting your own content if you aren't already an engaged member of the community is like walking into a room and shouting "hey everyone I have something you might like!". That's why there is leniency for active members of a subreddit to post their own things.
For people who are saying "thankfully HN is different!", it really isn't. It's basically the same as a subreddit run by the mods here. I personally like the way the mods run things here, which is why I participate here more. But at the end of the day it's just a different set of humans with a different set of rules (and a little bit more control since they can do things on the server that reddit mods can't). And on this "subreddit" they allow people to post their own content if it's relevant, similar to a lot of other subreddits that aren't /r/moviedetails.
>For people who are saying "thankfully HN is different!", it really isn't.
Yes it is. HN's moderation is run by HN. The rules are applied consistently because there are no third-party mods or rules. People know what to expect and that builds community better.
You don't have to wonder if some edgelord mod is going to get home from school and delete your post based on their mood.
YC has a vested interest in keeping the HN moderation a certain way; it brings a certain quality to the discourse, but it's different from the structure of any sub.
Dang has a direct relationship with the owners of the site and a responsibility to them; that relationship doesn't exist in any way on reddit. The mods run amok.
Suggesting HN is effectively just another subreddit dismisses several important differences.
No, I don't think that's how HN works at all. Dan can in fact run amok on HN. I don't believe there are significant pressures from YC on how Dan structures moderation here.
People unfamiliar with the setup here tend to think about YC first when they think about its management, but they should think about Dan first. My understanding is that things are pretty hands-off; even the perks that YC companies get here are, I believe, pretty much things Dan decided to give them.
>Dan can in fact run amok on HN. I don't believe there are significant pressures from YC on how Dan structures moderation here.
I'm sorry but this is absurd. If dang had a mental break and decided only cat picture posts were allowed going forward, there would definitely be a conversation. The fact that it has been hands-off so far means that YC and dang currently have consensus and that trust exists.
The trust between YC and Dan regarding HN is probably the whole story about moderation here. Again: people think YC exerts a bunch of influence in how HN is moderated, and I doubt it exerts much influence at all. Dan is running this place the way he'd run it if YC didn't exist.
I'm not sure why you're belaboring the point, because whether it's trust or a contractual agreement to run the site a certain way, the result is the same. Either way works for my argument, which was that the relationship is the difference from reddit's way. I don't care how the relationship is mediated.
About the most interesting thing you can say about YC's "vested interest" is that they have a vested interest in having HN continue to be moderated at Dan's total discretion. That interest would change, obviously, if Dan was raptured. But discussing the possibility of Dan's rapturing is not interesting, curious, or productive: we're not talking about the failure modes of moderation when moderators are incapacitated, we're talking about moderatorial discretion.
>we're not talking about the failure modes of moderation when moderators are incapacitated, we're talking about moderatorial discretion.
I saw the conversation as one about the structure of moderation. Paying a mod whom you meet with every week to sync up is very different from someone creating some small sub themselves and enacting any rules that don't break the sitewide rules. Likewise, a reddit mod has no vested interest in the site nor its policies.
That mod-admin relationship is special on reddit, in a mostly bad way since admins ignore many mod issues. Dang is more like a moderator and an admin rolled into one.
He was banned for trolling after being repeatedly warned to stop. I have no idea why you thought that was "pressure from above". But there are people who think any ban on HN is an insidious plot by YC, so it's not worth arguing.
His whole comment history and the warnings he received are right there for everybody to read. But, no, I'm sure you're right: it's all an elaborate plot, because BayesianWitch threatened to upend the YC hegemony.
Being courteous, thoughtful, and taking best-faith interpretations are key aspects of the ideology that is mandatory under this site’s guidelines. This is a wildly unpopular ideology at other sites.
If a user or mod noticed them breaking those guidelines over time, and they continued doing so after being repeatedly warned, then they absolutely would have been banned for their ideology.
Otherwise, I don’t think anyone really cares how disagreeable their viewpoint is, so long as they’re courteous and contributing while expressing it (and it’s not derogatory, hate speech, etc).
>This is a wildly unpopular ideology at other sites.
It's why I still end up browsing here even if I decide to delete multiple reddit accounts. People on reddit seem to have fallen into Poe's law of that Big Lebowski quote:
>"you're not wrong, Walter you're just an asshole!"
>"Okay then"
Another forum gives some leeway for discarding courteousness if the post is truly dangerous to heed and absolutely needs to be smacked down as an example of what not to do. But reddit presses that button for every situation: a pet being outside, a car doing something that may be legal in their state but illegal in the commenter's state, one-sided relationship drama, video games. It's all just flooded with insults and in-fighting over inane stuff.
I'd never heard of him, so I looked up his history. His banning causes me to reconsider participation in this forum at all. And it's sad to see the quality of those 2017 threads compared to today. So thoughtful, so many citations.
The fact of being an HN employee means that Dan has a sword of Damocles above his head the whole time. Whether or not YC are actually calling him up day-to-day and telling him to do X or Y is irrelevant; he's smart enough to figure out what X and Y are and do them pre-emptively, and he knows his livelihood depends on it.
Exactly, I'm so confused why people are speculating that dang is the only employee in the world not subject to a performance review. If someone is paying you, they have a stake in your work.
I imagine that (thoughtfully) moderating a site as large as HN is no easy task, and though I'm all for more transparency, I think logs like that have the potential to open a whole can of worms and add much more burden on Dan. All of a sudden you get a bunch of HN posts that solely exist to criticize Dan's decisions which clog up the feed. If a rule is added against this it looks like even more censorship.
Lobsters [1] has mod logs, but it also has a much smaller community, and it's invite-only so there's less need for moderation in the first place.
I don't think logs are needed in a community moderated by a single person; you don't need logs to know who did what.
But for something like Reddit, where there are more mods, the anonymity basically means impunity from any judgement; nobody but the mods themselves can say whether a certain bias in actions can be attributed to a few mods or to the whole mod team.
And frankly, if you can't handle a critique of your actions, you shouldn't be moderating the actions of others in the first place.
You are still making the wrong comparison. The equivalent to HN in the Reddit world is a single subreddit. When you compare a single subreddit to HN, you see there aren't many differences at all: a single set of rules being applied as the mods intend.
>a single set of rules being applied as the mods intend.
Perhaps you haven't had as much moderator interaction as I have, but this is not the case whatsoever. Some subs will have 30-100 mods and each will apply the rules differently per their own interests or pet peeves.
Well, then let's qualify HN as a mid-sized subreddit with a mostly well-behaved subscriber group and a bunch of tricks (such as non-trendy UI) that reduce moderation load, making it possible for a single person to handle it.
Point being, comparing HN to Reddit is a category error; HN is in the same class as subreddits. This is less about HN and more about Reddit itself: Reddit is not a community, it's a network of communities on a common platform. Each subreddit is its own reality.
I don't know if dang is the only HN moderator, but even if he's not, I've never seen any evidence of anyone in that role here capriciously removing or locking discussions in a way that's absolutely commonplace on Reddit.
There are subs on Reddit that I simply no longer engage with because I'm sick of dimwit edgelord mods locking discussions whilst I'm typing out a contribution even though there's nothing wrong with either the original post or the discussion. There's no sense of consistency in the way the sub rules are applied in each of those subs.
A few weeks ago on the CasualUK subreddit one of the mods went off the deep end and started hurling insults left, right, and centre, and banning people simply for politely calling out their poor behaviour. That mod's conduct absolutely violated the rules of both the sub and Reddit as a whole. It's not OK and, again, it wouldn't happen here.
It would be interesting if a subreddit actually paid its frontline moderation staff, and hired them via a board that didn't have mod responsibilities itself. I wonder if any of them have tried?
Most importantly: HN is moderated by paid moderators, who are hired and selected to do a good job here. They are still human and fallible, obviously, but it's a far cry from "random dude that started a community in his free time".
At this point Reddit mods look like unpaid employees to me. This was not the case say, 10 years ago, when Reddit admins enforced the sitewide rules and mods did their own thing. Now Reddit depends on mods to enforce the sitewide rules, instructs them in their work, removes them if their work product is not up to snuff, and I've even seen them demand minimum staffing levels be met.
I’m one of the mods of a subreddit with over 50K subscribers and all I can say is that it is very tiring. And I’m looking to quit being a mod.
People just can’t read or won’t read the rules. One thing I hate most is YouTubers hijacking or spamming without even engaging with the community or in the comments. It’s so frustrating to keep removing it.
Reddit is several orders of magnitude larger than it used to be. Managing a social gathering of 500 people has qualitative differences from managing a gathering of 5 people. Strategies that are beneficial at the small scale might not work or even have negative effects at the large scale.
>In their discussion with the mods they mention "site-wide rules" which makes me think they don't understand that the mods of each reddit make their own rules.
They're referring to the 10% guideline that many subreddits follow:
>You should submit from a variety of sources (a general rule of thumb is that 10% or less of your posting and conversation should link to your own content), talk to people in the comments (and not just on your own links), and generally be a good member of the community.
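In practice, enforcing that guideline amounts to eyeballing a simple ratio over a user's history. A minimal sketch of that check (hypothetical code, not anything reddit or the mods actually run):

```python
# Hypothetical illustration of the 10% rule of thumb quoted above: flag an
# account if more than 10% of its submissions link to the user's own site.
def exceeds_self_promotion_guideline(submitted_urls, own_domains, threshold=0.10):
    """submitted_urls: URLs the user has posted; own_domains: domains they control."""
    if not submitted_urls:
        return False
    own = sum(any(domain in url for domain in own_domains) for url in submitted_urls)
    return own / len(submitted_urls) > threshold

# e.g. 3 of 20 submissions point at the user's own blog -> 15% -> over the line
urls = ["https://myblog.example/post"] * 3 + ["https://other.example/x"] * 17
print(exceeds_self_promotion_guideline(urls, ["myblog.example"]))  # True
```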
hmm - I didn't get that read about the author at all.
In fact the whole premise of the article (comparing moderation policy differences between different subreddits) would suggest they totally understand that reddit moderators aren't all pulled from one homogeneous pool of employees.
I think the "site wide" bit was actually brought up by the moderator - referring to the fact that the /r/moviedetails moderation checks your full post history across all of reddit (not just your behavior on /r/moviedetails) and bans you from their subreddit based on that.
Everyone wants everyone else to just watch them make the money and play video games in reality... They don't want to have to watch anyone else playing and winning at them... This is the main reason why large-scale moderated communities always become corrupted... And also why algorithms will always get corrupted too.
Don't be so sure that some of the mods don't work for Reddit. Some of them are powermods modding multiple large subs in the millions of subscribers, and they seem to be able to get their way instantly with anything they want. Pretty lucky of Reddit to have people who can mod multiple huge subs for years for "free". Not only that, but Reddit wants to do a public IPO to raise more money. So let's see how that's going to go with the investors: "we have vAlUe FoR tHe ShArEhOlDeR by not paying staff, but these are capricious little creatures who we have no idea who they are and they have control over user engagement and the user experience but don't worry about it". How well is that going to go? Not very well. At the very least, it'll be required to be published in the financial statements how much of the Reddit budget goes to PAY mods, and this will finally put an end to the oft-repeated claim that all the mods are volunteers.
I hope the supreme court takes away section 230 from these people. They have abused their power so much by permanently banning people who simply have an opinion that is not the same as theirs, that they deserve to lose it.
People are entitled to run their own private sites in ways that restrict you from sharing your opinions that are not the same as theirs. The view of the law that you're espousing is one where, simply by dint of letting people sign up to a service you run, you take on an obligation to carry their messages. That's an intense amount of authority to give the government over private enterprise, and a stultifying precedent for anyone who wants to stand up a site of their own.
Calling sites like Reddit, Twitter, Facebook, etc "private sites" is a bit of an intuition pump, even if the terminology is legally correct. You're basically evoking an image of Bob's Diner where a guy runs a restaurant and kicks out rowdy patrons. The story changes when these websites are massively popular, have majority market share in their space, have taken massive amounts of funding, etc. Any successful social network is also almost by definition a pseudo-monopoly (I'm using this word loosely, not legally) simply due to the massive advantages you get from network effects. Reddit is pretty much the first place people go to start and join niche communities of like-minded people. You can say "go start your own website" but now you are competing with, well, Reddit. The basic counterargument to what you're saying is that possession of that kind of market-leader advantage should come with some level of responsibility.
> The view of the law that you're espousing is one where, simply by dint of letting people sign up to a service you run, you take on an obligation to carry their messages. That's an intense amount of authority to give the government over private enterprise, and a stultifying precedent for anyone who wants to stand up a site of their own.
Couldn't you use this formulation to say that anti-discrimination laws make it so that, simply by dint of providing a service, you take on an obligation to provide that service to everyone? That's an intense amount of authority to give the government over private enterprise, and a... you get it. We accept government intervention when we believe it benefits society. The argument being made here is that it would benefit society if these massive pipes of information that in practice everyone uses were similarly regulated in the types of discrimination they can engage in.
There's a calculus to be made about what level of control over their content platforms should have in order to best serve the interests of society. It might be that despite all the things I listed above, the result of that calculus remains the same, but it is not immediately obvious that that is the case, as your "private sites can do what they want" formulation would have one believe.
> The story changes when these websites are massively popular, have majority market share in their space, have taken massive amounts of funding, etc.
Not really. The only substantial change is that getting thrown out of such a large and all-encompassing diner is a lot less convenient. The fundamental reality is still there: no website (maybe unless it's owned by your government, and even then) no matter how large is obligated to carry your message, and it is increasingly affordable and trivial to start your own website if other websites exercise their inherent rights of refusal to carry your message.
> Couldn't you use this formulation to say that anti-discrimination laws make it so that, simply by dint of providing a service, you take on an obligation to provide that service to everyone?
The obligation is to not make membership in a protected class or lack thereof a condition of providing a service. Bob can't throw you out of his diner on the basis of you being some race he doesn't like; he can nonetheless throw you out if you're shouting advertisements at everyone else in the diner. Same deal for a website. There is no implication there of any obligation to serve everyone: only an explication of constraints on the reasons someone can refuse to serve someone.
I know that that is the current state of the law. I’m not disputing that there is currently no legal framework to force Reddit to allow content.
I’m arguing the “ought” rather than the “is” here. I disagree with the philosophical stance that simply because Reddit is a private website, they are entitled to control the content as they want and that no level of qualitative difference between Reddit and the average website is enough to change this.
In general, we regulate private enterprise when it has negative externalities on society, and I think there is a discussion to be had about what those externalities are here. I don’t think “private companies can do whatever they want” is a sharp enough tool to engage this issue with.
>I’m arguing the “ought” rather than the “is” here. I disagree with the philosophical stance that simply because Reddit is a private website, they are entitled to control the content as they want and that no level of qualitative difference between Reddit and the average website is enough to change this.
I disagree on a philosophical level. The rules and users may be frustrating, but in the grand scheme of things, reddit isn't breaking the law nor spreading hate (well, no more hate than your average internet user. We're not talking about Infowars here). I see no reason for government interference on the basis of their moderation and curation, like this thread is suggesting.
>In general, we regulate private enterprise when it has negative externalities on society, and I think there is a discussion to be had about what those externalities are here.
Sure, but I honestly can't think of any societal effects that wouldn't also be felt from the regular old physical analog of "people talking and arguing with each other".
- You can talk about groupthink, but that happens IRL and can be taken to an extreme with cults. Cults aren't illegal until they break other laws.
- You can talk about restricting artistic freedoms, but said freedoms aren't really that protected in the world at large.
- You can talk about freedom of speech and fall into the same conservative trap as other groups, conveniently forgetting that freedom of speech is meant to protect your speech from government censorship, not from other individuals.
- There are privacy concerns, which are already being addressed. That's one of the few analogs not easily transferable to the physical world.
- Then there is the anonymity aspect of reddit, which has caused its share of issues ever since the days of "bathroom writings". There are plenty of ways to be offensive without yelling it in someone's face.
I just don't see an angle here that would justify a need to "anti-trust" the site as some public good and force all posts to remain up, nor clamp down and enforce civility in a website.
Okay, I'll take a step back and clear the table of about 50% of what you and the other replies seem to be objecting to. I don't think Reddit should be regulated by the government, or laws, or some anti-trust, anti-monopoly thing. I don't think 1A should be changed so that Reddit can be forced to allow speech.
What I am in favor of is recognizing, as a society, the value of free speech, not just as a constitutional technicality but as a principle. I believe we ought to value it as a good. And I think that as a society, we should stop and be like, hey, wait a minute, online communication is now dominated by a handful of sites, doesn't it violate the spirit of free speech if, in practice, all those sites enforce roughly the same overton window, and your options are to (a) get in line or (b) not use the 5-7 sites that everyone else uses?
In fact, I think the conversation about the constitutional limits of 1A, or governmental rights, or whatever, is a distraction. I'm not suggesting the government step in. I'm just saying we clearly have a situation where the principle of free speech is being violated — where people are not able to share certain ideas on the platforms that everyone else uses. I'm aware we don't have a law protecting that. I'm making a prescriptive argument that society ought to view this as a bad, dangerous thing.
So in terms of a practical agenda, I dunno, I would like to see a stronger shared appreciation of free speech, and bottom-up pressure for the big communication platforms to stop enforcing their own capricious filters. From this perspective — not the 1A perspective — I think the argument that "private enterprises can do whatever they want" is pretty weak and irrelevant. The actual question is what externalities it has on society when the platforms used by everyone decide what you can say on them. I think the answer to that is not obviously "nothing," and probably "something."
In that case, this hinges on whether websites having the right to moderate themselves as they see fit does indeed have negative externalities on society - and, even if so, whether those negative externalities outweigh the harm of dictating how websites moderate themselves. Perhaps we differently value the rights to speech, press, and association - all of which such a regulation fundamentally infringes.
Notice that I ain't mentioning corporations or individuals here, because that piece fundamentally does not matter; an individual could create a website used by billions, and a corporation could create a website used by a single person (not to mention that there are countless organizational structures beyond just corporations - especially once you go beyond the constraint of what's legally recognized). What matters is the size of the audience, and "you have such and such rights unless you're popular enough to have any tangible influence on society at which point the State will dictate what you're allowed to say or not say" doesn't sit well with me.
Maybe it should. There are plenty of similar laws (e.g. when you sell shares to your 2001st investor you're suddenly subject to a much stricter disclosure regime, and this is widely considered a good thing).
That law is an OTC anti-fraud measure. The Constitution has something to say about the ability of Congress to nationalize and impose its own speech controls on private websites that happen to become popular. Congress also doesn't get a say in who performs the Super Bowl Halftime Show, despite its immense audience.
If we’re going to assert the inalienable right of free speech, then arguably free speech is such an important cornerstone of society that we should think long and hard before letting any company have such influence in that sphere to begin with.
To me the big glaring central point here is that everyone uses big tech platforms. It’s just a fact of life. If you tell people, "sorry, I don’t use Facebook, I use some wonky alternative instead, can you add me there?" you’ll get weird looks. I think it’s insane to just ignore that elephant and continue studiously down the path of, well, they’re a private company and therefore are entitled to publish whatever they want.
When we established the precedent that private entities can say what they want, we clearly didn’t have Facebook in mind. Like there is just intuitively an obvious difference between a private journal deciding what to write about and a huge platform everyone uses deciding what to pass through or not. The state of the law doesn’t currently reflect this but that doesn’t mean the difference doesn’t exist.
Who's asserting that "inalienable right"? You don't have the inalienable right to speak on any property I own. If you walk up to my porch or into my lobby and start yelling about ivermectin, I'm going to kick you the hell out, no matter what you think your rights are.
The actual right you have is spelled out in the Constitution: that Congress shall make no law abridging your freedom of speech. That's all you get. Thankfully, the framers did not add a law granting you the right to force me to amplify your speech with my own resources.
Get a blog! The very worst people on Earth have managed to keep blogs running.
Sorry, I'll back up a bit. I thought you were arguing that Reddit has the Constitutional/inalienable right to not have its speech compelled by the government. I agree, but I think the motivation behind that right has to be considered. Freedom of speech from governmental control is important, but the reason it is important is that open discourse is an important part of society. The pipes we use to perform that discourse are arguably a sort of "commons." I would argue that, that being the case, maybe we should think hard about leaving it to the control of private enterprises.
This really seems like a case where the existing categories in the law created hundreds of years ago are no longer adequate to capture present realities, similar to how they didn't have AR-15s in mind when they wrote 2A. If you ask whether Reddit is a public or private company, sure, it's private. But if you look at what intuitions back then were around "public" and "private", and for that matter around what "speech" was, it seems like the real answer is that (in regards to this debate) Reddit is a third type of thing that doesn't have a name yet.
> Get a blog! The very worst people on Earth have managed to keep blogs running.
This might illustrate my point better. I assume you don't support people getting banned straight off the internet. Is it simply because no one owns the internet, but Reddit is owned by Reddit (the company)? Maybe it's time for that boundary to be reexamined.
You are just telling me what the current state of the law is and how society is currently run. I agree that is the world we live in. I am making a prescriptive argument about how I think we ought to do things. Maybe my argument is weak; I’ll keep trying.
My take is that the word "just" is doing a lot of work in that sentence. I'm not super interested in debating alternate constitutions; personally, I for the most part like the one we have. I might not be your best discussion partner on this.
"Private" corporations exist on the people's, via Congress', sufferance. Any entity benefitting from limited liability is not fully private and does not deserve the same rights as private individuals.
> "Private" corporations exist on the people's, via Congress', sufferance.
All aspects of a capitalist socioeconomic system exist on the people's, via Congress', sufferance. The very land on which you stand is - at the end of the day - the territory of some government (unless you live in international waters, in space, or in Bir Tawil), and said government carves off subsets of that land for its people to use as they see fit.
Fortunately, we the people here in the US have already long decided that "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances." Those rights are not conditional on whether an entity is an individual person or a limited-liability corporation or an anarcho-syndicalist commune or what have you.
Sorry, but if you have control over 5, 10, 20+ million subscribers, it's not your own private site. You're either a paid employee or you're a freeloader riding on someone's carriage, in which case they should be paying Reddit for allowing them a platform and bandwidth in which they can shape and control the narrative by controlling what speech is allowed.
It goes beyond just controlling the speech or opinions of people who disagree with you such as I am doing here, there is also money involved such as in controlling which links get published. Those links drive traffic to specific sites which host ads. Surely you do not believe that all the links to specific sites over and over is simply because they are "fast to publish" or something do you?
>Some of them are powermods modding multiple large subs in the millions of subscribers, and they seem to be able to get their way instantly with anything they want.
They don't work for reddit; it's even worse than that. They are very profitable for reddit despite not being employed, so admins look the other way. The work they do may or may not be monetized in other ways.
>I hope the supreme court takes away section 230 from these people
That's slashing your face to spite the zit. There are undoubtedly unfair moderation practices, but the implications of 230 go far beyond that.
Also, even if I agreed with taking away 230, the issue of international boundaries will make it hard to enforce. No one's going to extradite some kid in Brazil because they banned an American user on r/Art over an accusation that the American's art was AI. And that's even before going into how hard it is to win a libel case in the US.
Interestingly enough there’s been some mod drama over on /r/AskScienceFiction today. This is a sub where people ask questions and get in universe responses. It’s very nerdy and can be fun. It also can attract a certain type of poster who can be lacking in say, self awareness and take it too seriously.
The new Harry Potter game is generating a lot of posts, and a particular mod who has personal reasons to dislike JK Rowling added an automod post to the top of each post related to Harry Potter, spoiling the game’s story. They’re unrepentant about it and their responses show what a lack of self-awareness in moderation can do. Probably the worst case I’ve seen in a while.
If I painted it as a political agenda someone else would complain. They are personally affected because they are trans. That seemed the most honest way to phrase it.
That's something I find surprising about reddit - I can't remember what the exact percentage is, but I think it's something like 0.1% of the world population is trans... but it seems like 90% of the moderators on reddit are. What is it about reddit that attracts them?
Perhaps it's because they feel more comfortable expressing themselves in a pseudoanonymous forum. Perhaps they discovered that they were trans by virtue of being eternally online. I believe it is the latter to a great extent. I don't know if there are any studies of this (or if studying such things is even permitted), but in my own fairly small family/friends groups there are 6 people (out of around 30 or 40) who have apparently learned they are trans within the last 6 years, and to a person they are highly active on social media, chat rooms, and reddit or other internet forum sites (resetera, kiwifarms, 4chan, etc.). Prior to becoming internet addicts, not one of them expressed any behaviors or interests suggesting they were born the wrong gender, despite it being relatively acceptable in their social groups prior.
Just learned this week that an extended family member is trans. Not a single sign of gender dysphoria throughout her entire 21 years of life. 6 months of university and surgery is pending.
The social contagion is so plain to see but I guess talking about it is a taboo.
That a person of a marginalized group _hid_ aspects of themselves and lied about themselves to prevent harassment.
Or a person goes to college for 6 months and decides to get major surgery because it is socially popular?
Just because YOU didn’t see the signs doesn’t mean they didn’t and flippantly using your interpretation of their story as proof of a “social contagion” is unfair to them.
She's an ordinary girl. Privileged in every single way: wealthy, good looks, good brain. She's never shown even the tiniest sign of being uncomfortable in her body. Not in toys, clothing, choice of friends, just zero signals at all. She does not live in an environment that is in any way conservative or oppressive, nor religious.
You don't have to take this lack of signs from me, it's from her mother who lived with her for 20 years in a row. The change comes out of the blue, is drastic, and closely aligns with her new environment and friends. So yes, social contagion is very much on the table if not the most likely factor. There's nothing controversial about social contagion as a mechanism, most people's behaviors and mindsets are drastically influenced by culture.
You seem to take my comment as judgmental, but it's coming from empathy. It's her choice and she'll live with the consequences, not me. But when there is an inexplicable spike in young girls changing body parts and making irreversible choices, and seemingly only in big city universities, then I have questions. Because I care about them, and I will never apologize for that.
> The social contagion is so plain to see but I guess talking about it is a taboo.
This is very sad yet, regrettably, very true. It is becoming increasingly popular to promote drugging and mutilating not only adults, which is bad enough, but also children. This goes far beyond just the most obvious extreme of so-called "transgenders" and extends to the entire "LGBT" complex. I've personally heard, firsthand, college students express sadness about being "straight" or put various "identities" on a pedestal.
Acting like you are trans, why not, who cares. Whatever feels good to you. Surgery is permanent, and we all did stupid things in the past.
Six months is pretty short for a life-changing surgery from which there is probably no coming back.
I have similar questions about the apparent over-representation of LGBT people among high-profile software developers, but it feels like it's never okay to ask those questions.
My own explanation is that to be good at computer science you need to have spent a lot of time in front of a computer during your childhood. The LGBT community used to be much less (and still is in most places) accepted IRL, and as such you'd have to find your community online.
So here you have it, that's my theory, a lot of LGBTs spent a lot of time online and then got hooked on computer science stuff.
I'm gay and many of my legit LGBT friends have noticed a trend where lots of younger folks are claiming to be queer, bi, non-binary, asexual, pansexual, trans, etc., without a lot of evidence that they're anything other than a straight person looking to feel special.
Same thing on Mastodon, which I tried for a month or so. It seems every single profile you open some of the claims you mention are there, followed by disabled, and a bunch of (self-diagnosed?) mental illnesses.
It's like they're collecting pokemons. And it's not used as a list of characteristics, it's their "identity" and an accomplishment. It is central to every single thing they say, whom they associate with, etc.
One possible factor is that GSRMs are still widely persecuted and thus are not inclined to be "out of the closet" about it. That's also why people can get a bit defensive about asking questions; enough people ask such questions in bad faith that it makes it hard to identify when someone's asking in good faith.
Yep. Same with the speedrunning community, autism, and trans. The Jordan Peterson interview with Chloe Cole is where I learned that autism is correlated with, if not causally related to, transitioning. Two-hour interview warning: https://www.youtube.com/watch?v=6O3MzPeomqs
I find it funny that the same kind of people seem to attack Jordan Peterson not in a "here is why you are wrong and here is evidence" way but in a "<a bunch of insults>" way.
The internet is the only place where the reality of who they really are, right down to the DNA, can be hidden. In real life, in the average case, it's not difficult to see that someone is a man dressing as a woman or vice versa; not so on the internet. When you combine this with the near-complete domination of certain political ideologies on the site, it becomes much clearer.
Personal as in their private life or identity is affected, not interpersonal as in a relationship between two or more people or groups. Political would relate to government policies, or how a country is run. I’m flabbergasted that I need to explain this.
That's just plainly false. Their private life was not affected by what JK Rowling has said. Rather, their political opinion differs and a choice to be offended was made. As political protest, the mod decides the best course of action was to violate subreddit rules and spoil a game for people (instead of, you know, offering a counter position to Rowling's views).
You keep claiming it’s political when it isn’t. The mod feels personally that JKR is bad and lashes out on a website to hurt other people. They aren’t protesting the legal status of trans people. They aren’t demanding changes to health care policy. They are saying some rich celebrity is bad and fuck you if you enjoy that person’s creation.
It’s all about personal feelings and has nothing to do with politics. What role does JKRs crusade against the trans community have to do with politics? Does she hold office or set policy?
Unless you’re loosely defining politics as an umbrella term for personal beliefs which makes your initial response moot.
> What role does JKRs crusade against the trans community have to do with politics
Is she on some kind of crusade I'm unaware of?
All I've heard from the ones complaining was not a single quote or anything actionable, and all I've seen from sources trying to just report on the situation was some spicy tweets that summed up to "biological women are not the same as trans women" and an "I will wipe my tears with my royalty money" response to the twitter abuse.
> What role does JKRs crusade against the trans community have to do with politics? Does she hold office or set policy?
Her "crusade" is about who can make decisions in society, i.e. control what others do, which is what politics is. Formal political structures are not the only venue through which it happens.
If someone is trans, or cares about trans people, it's entirely personal to them. The "issue" isn't always a political one.
Yes, the mod in question may be forcing their personal worldview on a subreddit, but the OC's phrasing of the overall situation seems completely accurate.
It's political because JK Rowling actively donates to enact political change for anti-trans legislation. It isn't the case that the recent hysteria is because JK Rowling dislikes trans people.
> The current process for receiving a gender recognition certificate dictates that applicants have to be “medically diagnosed as having gender dysphoria, go through a minimum two year process and be aged over 18,” The Times reports. The reform bill no longer mandates medical evidence of gender dysphoria and lowers the age requirement to 16.
I dunno, sounds like a solid reason to have issues with it; having a medical professional not be involved in the process sounds like a travesty ripe for abuse and mishandling.
She's a feminist whose activism includes fighting for the rights of women and girls to have female-only spaces and services. This isn't an anti-trans position.
Hogwarts Legacy, the game, is simply part of the franchise. It does not promote any sort of political speech. People attacking it are attacking it because they have an issue with the person, not the franchise, in which JKR doesn't actually exist.
My perception of JKR is that she was simply a person with a personal opinion, and then it became a holy war against her because certain individuals became enraged that they are not loved and accepted by her and the world at large. They attacked her, which made her defensive and feel more set that there is a problem here.
I am sure that if she had been left alone there would have been many opportunities for dialogue, but that can't happen now.
For what it's worth, I do understand her perspective. She grew up as a female, with a fully female body. It's hard for her to wrap her head around it that someone at age 30 or whatever age can just pop some hormones and request to be a woman. They've never had the fear of walking down the street alone at night, never had the fear of being raped or being the victim of violence, never experienced nonstop discrimination and objectification, never had an ever-present focus over reproductive biology, etc.
All this self-defeating noise attacking JKR just makes people less understanding because they only see 1 side making all the noise.
Or you could accept that people have opinions that are different to yours and books are orthogonal to her opinions anyway. It's really concerning how authoritarian some people are while ironically claiming others are the bad actors. This is what the echo chamber voting system on reddit results in.
Except they didn’t just stop there. They used their platform to actively hurt those capable of separating the art from the artist. I’ve no interest in the Harry Potter series. I’ve never read or watched any of it and I’ve a negative opinion of JKR’s hateful opinions. I don’t feel the need to punch down.
For a moderator who has agreed to apply the rules of a community for the betterment of that community to decide that those rules don’t apply to them is simply not good enough.
FWIW, promoting a bunch of your content (even if original) is seen as anti-participatory in some communities. Even HN recommends against it in the guidelines. While we're at it, at HN's scale you see the same polarization of engagement in /new. Simply posting original, quality content doesn't warrant engagement.
Reddit gives you a myriad of subreddit case studies to blow up, but I think this is less of a scaling problem and more about the split culture between communities.
It's not a scaling problem, it's a "reddit likes free labor" problem. They aren't going to give up on the small group of self-important idiots who named themselves mods 15 years ago and still haven't quit, because they can't find new ones. No sane person will do such work for free in 2023.
Trying to reason about the behavior of the mods is like arguing with the weather.
It's a little bit of A and a little bit of B. I first started using reddit slightly before the great Digg exodus. Back then all the subreddits were relatively small and a lot of them were great. It's reasonable to me that someone who was passionate about a hobby like hiking or bicycling, or even video games would want to volunteer their time to nurture a community they enjoy.
Now, there are very few 'mainstream' subreddits that are a reasonable size. You have to be in to a niche of a niche to find a good community. No reasonable person would volunteer their time or get enjoyment enough to wade through anonymous accounts yelling at each other on /r/politics or /r/pics. So you end up with a bunch of people who are willing to trade their time to feel like they have control over other people, and naturally they abuse it.
reddit is a shell of its former self and having professional moderators with clear and accountable rules would be a major step forward. reddit has always shirked accountability though in the pursuit of profit, so I don't really see it happening outside of a change in the law.
We know some power-users are paid to promote things by advertising agencies and other special interest groups. We know some PACs are active in trying to shape public opinion on Reddit. I wouldn't be surprised at all if some of the moderators, especially the "powermods", are earning a living based on payments by some of these groups.
That is a popular opinion amongst redditors, and I find it a bit conspiracy-theory-ish. Shills doing anything potentially impactful are often easy to spot and tend to be called out on it or ignored. I expect there is some quid pro quo going on, but "earning a living" seems like a stretch (with maybe some very rare temporary exceptions in burgeoning frontiers).
More plausible is that there are lobbyists or employees of think tanks who use reddit as one small part of their overall efforts. There are likely a ton of failed attempts to make something go viral, with the occasional lucky success. I can't see the time investment usually being worth it for anyone actually looking at results.
Of course, for those with money to burn who are too lazy for followup, someone they employ could be making money as a 'reddit influencer', but I just don't see them accomplishing much. Throughout the history of post-agricultural civilization, there have been dead weight positions eagerly filled by those more interested in easy money than taking pride in their work. It's like a lottery for the ethically challenged, and what they do is ultimately of very little importance to the rest of us.
> Shills doing anything potentially impactful are often easy to spot and tend to be called out on it or ignored.
Not a chance. The easily spotted are the least likely to be shills, or are very new to it. If you know that people have been paid to infiltrate actual physical groups for years, in the UK literally having children with members of those groups, I'm not sure why you think it would be so much harder for a pseudonym to fool reddit visitors.
I'm more curious whether they mostly use off the shelf tools, or mostly their own tools developed in-house. We need a new post-2016 Snowden-style leak.
What’s hilarious is the idea that Reddit users are worth influencing in any way shape or form. But I suppose once you roll up the niches you end up with a big number.
In a medium where clicks and views (no matter the quality of those views) result in money down the line, it doesn't really matter. Ads want eyeballs and reddit has a lot of eyeballs to be pitched with.
The problem with reddit mods is that it's like the cops: the people least suited to the job are the ones who are most attracted to it. We end up with subs that effectively have no rules except the random whims of the mods and reddit themselves don't care at all. The comments on reddit reflect this perfectly because it's become more toxic over time, a haven for people interested in the casual abuse of others.
I'd say one difference between reddit mods and the police is that the stakes are really low on reddit.
Arguing or getting upset about mod decisions gives off strong "fight with the bouncer" vibes. Even if you're right, there's no way to "win". The best course of action is to just quietly move on.
It varies by sub, and by the purpose of the sub. Some are terrible, some are quite good.
For example, the mods of r/askhistorians are so strict that in most threads the majority of the comments are deleted. But they apply the rules consistently, and keep the sub tightly focused on questions answered by professional historians, in detail, with sources. If you have a historical question you'd like answered, it's an amazing resource.
AskHistorians is a great resource until you watch how topics in your area of expertise are answered and the kind of diversion from wide consensus that is tolerated with scant sourcing. It basically mirrors what's been happening to academia. As long as you can source a peer review on gay dogs in parks you'll be taken as credible.
I remember seeing posts there talking about how Ancient Sparta was never exceptional and actually lost most battles. As someone who has read all the primary source material on the Spartans... that's total crap.
I'd argue lately that most are terrible and a few are good. AskHistorians does that because they employ literally hundreds of mods. It's not a scalable solution.
I would totally agree. So many of them just seem to be on a power trip and moderate based on their mood. It's sad because it's really killed reddit for a lot of people.
My guess is, being a mod means you see patterns and you know when something is going to descend into a shitstorm because you've seen similar comments descend into a shitstorm before.
So eventually mods kill comments not because they particularly dislike them but because they can't be bothered with dealing with the inevitable shit to come.
It's not a job. It's a hobby at best. The people who do it don't have to care about your free speech, nobody has to care about your free speech other than the government.
They treat it like a job. Regardless, I think when you tire of something that's not critical to your livelihood you should step down.
But if that happened, powermods wouldn't be a thing.
>The people who do it don't have to care about your free speech
No, but I don't have to care about the bog swamp of discourse you moderate just because "we have 1 million subs". Well great, everyone is angry and nuance is talked down. I don't want that to be the place.
But if keeping your mod power is more important than, well, moderating your community, you reap what you sow.
> Reddit mods are arguably the worst part of reddit
No, the actual administrators (the ones that work there) are. They're the ones that can actually ban entire subreddits and entire user accounts. I had my account permanently suspended for upvoting vaccine hesitancy. That wasn't done by a reddit moderator.
The de facto policy that men and white people as a group aren't covered by their anti-harassment policy also comes directly from the employees, not the mods.
I already knew where this was going. Reddit has been unusable for me as a contributor for a long time, I'd say 9/10 when I post something on an arbitrary subreddit my post tends to get deleted.
It really is infuriating: either you didn't comply with one of their many rules (and every subreddit has different rules), or you used a keyword that triggered moderation, or you posted in the "wrong" subreddit (I wish they would just repost for you in that subreddit then), etc.
At this point I've pretty much stopped posting, because I already know it'll be a nightmare to get a post approved (and even more so if you're on the app, which makes it really hard to retry posting if you get your post deleted due to forgetting about one of the subreddit rules).
Another bad thing is that, if you find that a subreddit community is toxically moderated (hurr r/sanfrancisco) then you don't have much choice but to leave completely. It's often way too hard to compete with an established subreddit, or to have your voice heard if you want to complain about the moderation or promote an alternative. Subreddit takeovers are truly a thing of the past due to the dilution of them all.
If I had to guess why there's such a problem, I would say: when you're moderating a subreddit you get tons of people who are here just because your subreddit was crossposted somewhere else, or suggested in their feeds. Very few people are on your subreddit because they visited the direct URL. So nobody cares about your rules, or your culture, etc. A subreddit really became like a normal category to sort posts on Reddit.
>when you're moderating a subreddit you get tons of people who are here just because your subreddit was crossposted somewhere else, or suggested in their feeds.
that's called the front page, and you're right. when a post hits the front page (reddit.com, not your specific sub) you get flooded with low quality users who don't see a sub, but a post. it's why you may see the same trend of responses across a myriad of topics, because they are coming from the front page and carry their own axes to grind.
People are focusing on the Reddit mod narrative but if you go down in the article further, there's some interesting ideas.
The author talks about Dunbar's number [0] and does a speculative approximation, saying that good faith communities are about the maximum size of the "friends-of-friends" graph plus some of the "friends-of-friends-of-friends" graph. That is, say you trust your friends to have good faith discussion (around 150 of them) and you trust your friends of friends to have good faith discussions (150^2), and maybe some portion of your friends of friends of friends (now we're at 150^2.5); then the community size is about 300k.
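Spelling out that arithmetic (just a back-of-the-envelope sketch of the numbers above, assuming Dunbar's ~150):

```python
# Rough sketch of the community-size estimate described above.
DUNBAR = 150

friends = DUNBAR                      # ~150 people you trust directly
friends_of_friends = DUNBAR ** 2      # 22,500
partial_third_degree = DUNBAR ** 2.5  # ~275,568, i.e. roughly the 300k figure

print(f"{friends=:,}")
print(f"{friends_of_friends=:,}")
print(f"{partial_third_degree=:,.0f}")
```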
The data size is small and anecdotal but the idea is pretty interesting, at least to me. It at least tries to answer what the critical online community size is: the point where good faith discussions break down, "scaling issues" start to creep in, and online communities change. A lot of folks here on HN talk about things like how to do social media at scale (for example, Shirky's "Three Things to Accept [about online communities]" [1] has no doubt been featured on HN before). I also wonder if this ties into the "1000 true fans" idea somehow.
Whatever the community size is where scaling issues crop up, this leads to practical advice on which communities to focus on if you want to engage in a meaningful way, promote your work, or find a niche community for a business idea, say. That is, look for communities whose size is large enough to be impactful but small enough so that it hasn't gone through the scaling phase transition. This number is most likely in the range of 20k to 300k, maybe with an upper bound of 3M.
Thank you! The Reddit mod thing was really just supposed to be a catalyst for some of the ideas I had about group size limits, not really the point of the article. Though I WAS a little annoyed by the mods lol.
Edit: rereading your comment this is a very good summary of my article and maybe I should have put something like this up front to stave off the inevitable focus on Reddit mods…
>look for communities whose size is large enough to be impactful but small enough so that it hasn't gone through the scaling phase transition. This number is most likely in the range of 20k to 300k, maybe with an upper bound of 3M.
The issue is like Earth in the solar system; way too small of a "goldilocks" zone and in the days of the modern internet it seems almost impossible to hit it.
Reddit is undoubtedly too big, but look for any other competitor and the discourse can barely even hit the 10k mark, where you're browsing maybe once a week hoping someone made a post or left a comment in a post. And any modern alternative that does hit that mark (discord, tiktok, Vine back in the day) will quickly balloon past that 300k range.
Your only solace seems to be seeking older communities that have stood the test of time. Even 4chan these days seems to have cooled down outside of the larger, more infamous groups.
Would it be potentially viable to make moderators more like filterers (e.g. like filter lists in Ublock Origin)?
You could "subscribe" to a moderator if you like the way they filter content, and if you "unsubscribed" from all moderators you'd get as much of an "unfiltered" view as possible.
Bad moderators, over time, would get fewer and fewer subscriptions to their filtering of content, and would have to adapt, or be fine with less "power" over the content they moderate.
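Roughly what I have in mind, as a sketch (all the names here are hypothetical; this isn't how reddit works today):

```python
# Hypothetical sketch of "moderators as subscribable filters": a user's feed is
# the raw post stream minus whatever the mods they subscribe to have filtered.
from dataclasses import dataclass, field

@dataclass
class Moderator:
    name: str
    removed_post_ids: set = field(default_factory=set)  # posts this mod filters out

@dataclass
class User:
    name: str
    subscribed_mods: list = field(default_factory=list)

def visible_posts(all_posts, user):
    """Return the posts not filtered by any moderator the user subscribes to."""
    hidden = set()
    for mod in user.subscribed_mods:
        hidden |= mod.removed_post_ids
    return [p for p in all_posts if p["id"] not in hidden]

# Two users: one subscribes to a strict filter, one to none (unfiltered view).
posts = [{"id": 1, "title": "OC art"}, {"id": 2, "title": "self-promo link"}]
strict_mod = Moderator("strict", removed_post_ids={2})
alice = User("alice", subscribed_mods=[strict_mod])
bob = User("bob")  # no subscriptions -> as unfiltered as the site allows

print([p["title"] for p in visible_posts(posts, alice)])  # ['OC art']
print([p["title"] for p in visible_posts(posts, bob)])    # both posts
```

Subscription counts per moderator would then be the feedback signal: mods whose filters nobody opts into simply stop mattering.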
- Most people never touch which uBlock filter lists they use, so moderators probably wouldn't care too much about maintaining decent behavior
- People would have no idea which mod to pick and would just pick the first few top ones, resulting in even more centralization; those top mods would have insane monetization options
- You also have the problem that if a top mod "retires", the combination of extreme centralization + a large portion of users who can't be arsed to go into the settings would mean that a fair portion of the userbase would be stuck with unmoderated content indefinitely
- I believe that reddit doesn't want their users to be able to "turn off" moderation, even just for themselves, which you would be able to do by removing all filters. Think of people posting child porn, etc.
I'd have just gone for an appeal system. Your ban message has an "appeal" button inviting you to make a post on /r/banned; once you've posted, reddit notifies a certain number of recently active users at random, sends them a DM telling them they've been selected for drama duty, and gives them a special upvote/downvote on your post in /r/banned or something. The results determine whether your ban gets overturned. With some thought put into edge cases (what if nobody gave a crap, do you get another chance, ...) but, hopefully, it's kept simple.
You track how many overturns per ban the average mod gets, single out the outliers, and let the community of the /r/banned sub decide whether to give those mods the boot.
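The "single out the outliers" part could be as dumb as this (purely illustrative; the 3x-the-median cutoff is an arbitrary choice):

    # Flag moderators whose bans get overturned unusually often.
    from statistics import median

    def overturn_rate(bans, overturns):
        return overturns / bans if bans else 0.0

    def outlier_mods(stats):
        """stats maps mod name -> (total bans issued, bans overturned on appeal)."""
        rates = {mod: overturn_rate(b, o) for mod, (b, o) in stats.items()}
        cutoff = 3 * median(rates.values())
        return [mod for mod, rate in rates.items() if rate > cutoff]

    # outlier_mods({"mod_a": (120, 6), "mod_b": (90, 4), "mod_c": (80, 40)}) -> ["mod_c"]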
Your top three points ring true (in a nutshell, people won't change their default filters), but the same could be said in general about participation in a community -- e.g. the people that struggle against the moderators in a community are usually not the ones passively viewing content, but more the power-users -- those who heavily comment, post, etc.
The power-users also (in my mind) determine the community -- if they feel restricted by not being able to post (as the parent link talks about) from heavy-handed moderation, they'd also most likely be the ones willing to change their filters.
> I believe that reddit doesn't want their users to be able to "turn off" moderation, even just for themselves, which you would be able to do by removing all filters [...]
To be fair, at the most unfiltered level I was still imagining the content would be "filtered" to comply with the site-wide rules, and nothing else.
I doubt this would help. I think it would just dramatically increase moderator harassment without actually increasing accountability more than a little, because knowing which mod is banning users doesn't mean you can remove them (other than by harassing them into submission).
I also don't see it as necessary or the fundamental issue behind a lack of moderation accountability. The problem with moderation accountability is that Reddit moderation is based strictly on seniority, there's really no effective way for users to protest bad moderation other than moving to another sub, and that's not really that realistic when the badly moderated subreddit has a better and more discoverable name. Admins will also rarely remove active moderators.
Reddit uses the actual users as the main filtering mechanism (upvotes/downvotes, spam button, report button) and this most of the time works better for most communities.
I think it's very questionable if giving moderators the power to soft-censor is a good idea, because it will cause moderators to censor posts they probably wouldn't have censored before, and moderator influence is inherently less democratic than user influence. Reddit Enhancement Suite has long implemented various filtering mechanisms for users though so users DO have some capability to censor things themselves if they have the know-how, but it's not like these mechanisms are built into say the official reddit app which is the main way reddit is overwhelmingly used.
> "...Reddit uses the actual users as the main filtering mechanism (upvotes/downvotes, spam button, report button) and this most of the time works better for most communities."
I might be alone in this, but for me upvotes/downvotes/reports have become increasingly less useful as Reddit has gotten larger and larger. They work well up to a point (while the user base acts in good faith), but eventually a report becomes a "mega-dislike", people click "spam" on links that are doing "better" than what they submitted, etc etc.
I've tried to stick to smaller communities to combat this, but they also can suffer from things like "drive-by" brigades from larger communities (e.g. when a post unexpectedly reaches /r/all).
> "...Reddit Enhancement Suite has long implemented various filtering mechanisms for users though so users DO have some capability to censor things themselves if they have the know-how, but it's not like these mechanisms are built into say the official reddit app which is the main way reddit is overwhelmingly used."
RES is indeed great -- although like you said it's a pity it isn't built into the site natively.
Reddit's decade-old rule that you shouldn't submit more than 10% of your own content is one of the weirdest artifacts of that era, especially since every social media company would kill for good original content.
Despite the loud complaints, many people on reddit don't really care about original content. They want what is funny or big news or otherwise garners an extreme reaction. And when you can't find those, reposting the surefire hits works just as well.
I find Reddit mods are particularly power crazy. Most of them seem to be doing it for power rather than for love of the content of their sub. There is so much personal bias and posts removed for personal reasons, or just to flex. I’m not speculating on the reason, but it’s definitely like that.
Yes, many. Subreddits are sticky and it's almost impossible for someone to start competing with the medium-to-big ones, because the 'obvious' names are taken. This is especially true for countries/cities/brand names. As long as the mods are not completely crazy, there is only one word for 'politics' or 'europe', and users will not go to a new sub just because the mods are better.
If they were individual internet forums, people would just leave instead.
Reddit acts as a big single sign-on that heavily biases users towards its existing subreddits. The fact that reddit admins still allow a small group of people to moderate hundreds of subreddits for a decade-plus is proof of that. If they cared, they would mandatorily recycle them.
4chan? I'm not sure they do anything but remove child porn, judging by the fact that I've never seen it there, thankfully. Maybe Gab or communities.win? I don't use those two enough to know. Twitter post-Musk? But even there I've seen people complain. It's a shame, because reddit used to literally advertise itself as a free speech platform.
>Most of them seem to be doing it for power rather than for love of the content of their sub.
In any bureaucratic organization there will be two kinds of people.
First, there will be those who are devoted to the goals of the organization.
Secondly, there will be those dedicated to the organization itself.
It is the Iron Law of Bureaucracy that the second group becomes more powerful than the first - and this is why the simplest way to explain the behavior of any bureaucratic organization is to assume that it is controlled by a cabal of its enemies. Because, by that law, it is.
I think one point the author is missing (or maybe I missed it in my reading) is that large subreddits are mostly about consumption, and that's what makes them less fun as a community. People post to r/pics for the chance their image, or a comment on the image, garners huge amounts of upvotes for clout on the site. In addition to volume, another reason they don't want people to submit their own links is that users don't want the competition to be corrupted by corporate marketing departments.
What is the alternative? Operating a discourse forum is high maintenance and low visibility. Facebook/LinkedIn? Horrible platforms in practically every way. The fediverse has some ideas about a Reddit alternative (Lemmy), but it's much less mature than Mastodon at this point, and of course it remains to be seen how moderation will fare if user numbers start rising.
Like so many aspects of the current tech landscape, it's largely a case of TINA (there is no alternative), even though the deficiencies are glaring.
TBH, something similar to Hacker News but taken to an extreme. Give actual value to an account and make bad behavior revoke those privileges. You need a minimum karma here to downvote, so take that and add more privileges, even up to being a light moderator or having a higher weighting in votes on comments. Reward good behavior and quickly remove the privileges if the tides turn.
You need some self-moderation in order to keep a healthy community, and this encourages and rewards regulars. It can also guard against "Eternal September" if a regular's vote matters more than a newcomer's.
Totally agree with the math here. If you look at Discord servers you see how they have to break down into authoritarianism as mods have less and less time to deal with each issue.
I had a similar experience. Posting some original personal blog, on-topic, long-form, and not commercial in any way, was classed immediately as spam. Concluded that closed user groups were the way to go.
Ok, hear me out: I am a mod of two very large subs. The larger one, and the one where I am mostly active, is HUGE. To give you an idea how large: there are anywhere between 7,000 and 15,000 comments per day, and in terms of active moderators it's somewhere between 5 and 6 (the others pop in and out for 5-6 actions a day).
First things first, beyond the reddit TOS, ultimately the moderators have the final word on what stays and what goes. And as such, some subs are moderated well, while others are a complete shithole. E.g. my country's sub is moderated by 3 far-right extremists. And I do mean that: some months ago a police officer shot at an illegal migrant and a mod commented with "The only crime here is the cop only shot once and didn't kill him". Someone was kind enough to translate the text into English and message reddit, and he got suspended. Which, as reddit goes, is not permanent, and he will be back. How they have been getting away with stuff like that for 12-13 years is beyond me.
But more on the sub I moderate: with this much traffic and this much content, we need to have a very tight set of rules and we need to enforce them. Some of those rules include no speculation, rumors or "I heard that", not to mention obvious propaganda and spam. All of which, believe it or not, are rampant on reddit, and the platform gives mods little to no tools to fight them. Hence the reason I've made several bots that work around the clock to ease our pain.
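To give a flavour of what I mean by bots (this is nothing like our real ones, just the general shape, assuming Python and PRAW; the subreddit name, credentials and phrase list are placeholders):

    # Very stripped-down sketch of a rule-enforcement bot, assuming PRAW.
    import praw

    reddit = praw.Reddit(
        client_id="...", client_secret="...", user_agent="modbot-sketch",
        username="...", password="...",
    )

    BANNED_PHRASES = ("i heard that", "rumor has it")  # crude stand-in for the real rules

    for comment in reddit.subreddit("examplesub").stream.comments(skip_existing=True):
        text = comment.body.lower()
        if any(phrase in text for phrase in BANNED_PHRASES):
            comment.mod.remove()  # enforce the "no speculation / rumors" rule
            comment.reply("Removed: please see the sub rules about speculation and rumors.")

In practice you'd want something like this to queue items for human review rather than auto-remove everything.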
Now I have no idea how the subs in question in the article are moderated or what their rules are. But from my experience, there are users that do exactly what the mods replied with. We have several users who use the platform and the sub to self-promote their youtube channels or websites, and we have had huge arguments with them about it. One notable example is a woman who is sort of an official, who went full Karen on us. We came to an agreement that we would let her post under the condition that her posts are strictly relevant to the sub. The next day, she posts a 30-minute video, and in our chat someone writes:
"Uugh, u/<redacted for privacy> posted another video. Does someone have the patience to watch through an annoying 30+ minute video".
Now picture this happening in a sub with hundreds of daily submissions: forget about the fact that we are volunteers; even if this were our full-time job, it would not be humanly possible to go through all of that. So if indeed the author has been posting their own content, looking from my perspective, I can sympathize with the mods. Now is it a bit extreme to ban someone for it? Yeah, totally. Of course you should first try to reason with the author and set some boundaries. The fact of the matter is, moderators play a huge role in how active and alive a sub is. And for the sub I moderate, believe me, we go to great lengths to make sure it stays active and alive. Apart from being moderators, we constantly discuss the content and contribute ourselves.
And at that scale it is understandable if some content slips through the cracks, which is fine for the most part. We try to make sure this doesn't happen, so the bots I made include an NLP model (which, despite being the product of a weekend's worth of coding, has been doing an awesome job of giving us a list of users to be bonked). And looking at the subs in question, they are tiny. If you are a user who visits regularly and sees 5 frequent contributors, 3 of whom use reddit for self-promotion, it's likely you will bounce off. And moderators are well aware of that.
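The "weekend NLP model" is honestly not much fancier in shape than this (scikit-learn, with made-up training data and an arbitrary threshold; the real one just has more features and actual labelled history):

    # The general shape of a "who should we look at" model: a plain text
    # classifier over comments, then rank users by average predicted probability.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    train_texts = ["check out my channel", "great analysis, thanks",
                   "subscribe to my site here", "I disagree because..."]
    train_labels = [1, 0, 1, 0]  # 1 = self-promo/spam, 0 = fine

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(train_texts, train_labels)

    def users_to_review(comments_by_user, threshold=0.7):
        """comments_by_user: {username: [comment texts]}; returns users to eyeball."""
        flagged = []
        for user, texts in comments_by_user.items():
            probs = model.predict_proba(texts)[:, 1]
            if probs.mean() > threshold:
                flagged.append(user)
        return flagged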
As I said, banning seems extreme, and what we would have done is try and reason with the user first. Of course, if that fails to yield meaningful results, then...
What HN did reasonably well was to train its own users to do a bit of moderation themselves; "bad" behaviour has a good chance of simply getting flagged, or of some user complaining about your comment.
I've seen that behaviour a few times in other forums, and for mid-sized communities it seems to work, provided you cultivate that culture in the members.
Oh no, our community has been incredibly helpful in that regard. They do report things when they see them. I was really pleasantly surprised by that, to be fair. And as time goes on, you learn who your most valuable contributors are and you start paying closer attention to them. Which is not to say that the automations and bots I made didn't take a ton of load off all the mods on the sub.
Consider the vague line between "free speech zone" and "dictated speech zone". It isn't like the dictatorships advertise the fact. And even ostensibly "free" zones tend to have that charming "voting" system, where too much deviation from the popular view means censorship.
So you could be in the middle of a conversation, thinking it's a free exchange of opinions, when it's actually an insidious patchwork of censorship and manipulation.
I think that the concept of the moderator is fundamentally deranged. Moderators are limited, egoistic, normal people. I don't want them telling me how to speak.
If the conversation needs policing, let the individual conversationalists do it for themselves.
Some kind of propagating peer-rating system is something I'd like to see. I.e.: I manually rate peer X, and peer Y is then rated by how peer X rated peer Y, weighted by how I rated peer X, and so on.
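Something like this, capped at a couple of hops (entirely hypothetical, just to make the weighting concrete):

    # Sketch of the propagating peer-rating idea: a direct rating wins; otherwise
    # each path scores (my rating of the intermediary) * (their rating of the
    # target), averaged across paths. Ratings in [0, 1], two hops only.
    def propagated_rating(me, target, ratings):
        """ratings: dict mapping rater -> {ratee: score}."""
        mine = ratings.get(me, {})
        if target in mine:
            return mine[target]
        path_scores = [
            trust * ratings[peer][target]
            for peer, trust in mine.items()
            if target in ratings.get(peer, {})
        ]
        return sum(path_scores) / len(path_scores) if path_scores else None

    # propagated_rating("me", "y", {"me": {"x": 0.9}, "x": {"y": 0.5}}) -> 0.45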
Is there someplace where alternative forum-management-methods are experimented with?
Subreddits seem to alternate between power-tripping moderators who ban for little to no real reason (or even because you are a subscribed member of another subreddit they don't like), and completely open season subreddits with no moderation at all.
HN is so much better, because we self-police more effectively, and because dang is even tempered while still being thorough.
A waiting period before posting or commenting, plus a few decisive and prominent examples of corrections before things get out of hand, likely goes a long way toward preserving participants' cooperation in making the forum enjoyable.
I was just going on in another thread about how curation and moderation are like a cooperate/defect equilibrium in game theory, where the job is to promote cooperators in the mission of creating a quality vibe and to isolate or filter out defectors - as opposed to measuring individual posts against a set of rules. It may be a deeper idea, as it's less about enforcing rules and more about rewarding strategies.
There was an HN moderator post about how he indexed on the effect of someone's posts, which seems like an immensely efficient way to leverage forum participants' revealed behaviors. Downvotes are a coarse measure, but a flame-war-starting thread is visually obvious, and detecting them scales really well because you can see and even measure sentiment in the responses. Flame wars are also costly to the people in them, which makes them an honest, high-value signal of the quality of discussion a comment produces - people have to spend time and effort to vent their spleens, whereas downvotes carry much less information. I remember thinking it was a very elegant technique.
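A toy version of "measure sentiment in the responses" (the word list and thresholds are invented; a real system would need something far better, but the point is that you score a comment by its effect, not its content):

    # Toy heuristic: score a comment by how heated its replies are.
    HEATED = {"idiot", "liar", "ridiculous", "strawman", "bad faith", "shut up"}

    def reply_heat(replies):
        """Fraction of replies containing at least one 'heated' marker."""
        if not replies:
            return 0.0
        hot = sum(1 for r in replies if any(w in r.lower() for w in HEATED))
        return hot / len(replies)

    def likely_flamebait(replies, min_replies=5, heat_threshold=0.4):
        # Lots of replies AND a high fraction of them heated -> the comment is
        # costly to the thread, whatever its own wording looked like.
        return len(replies) >= min_replies and reply_heat(replies) >= heat_threshold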
This strategies-and-effects driven approach is different from those Nudge Unit types, because the goal is measured by the quality of something shared, instead of, say, misleading people in service of driving a metric. (They think automatically signing you up for things you have to cancel is wise and virtuous, but lotteries that people actually willingly play are somehow unethical. Behavioral economists are just touts with airs, imo.)
We know that complex rules yield poor behaviors and reward arbitrage and defection. Simple rules reduce to being arbitrary, and really, rules generally just produce more rules. Guidelines and public examples (positive and negative) make people self-moderate, which itself has been shown to scale pretty well (e.g. HN).
The Dunbar reference in the article made me think that the best moderation probably doesn't scale much past that. Beyond the Dunbar number, perhaps it's more accurate to say that a forum is no longer a mere thing but becomes a culture. The thing that doesn't scale: a forum you can moderate with rules, a community you can moderate with standards, but when a community graduates to a sub-culture, managing its strategies at that higher level is likely the way to maintain its equilibrium.
My perspective on moderators is identical to my perspective on cops, politicians, and anyone else in positions of power: power corrupts. The larger the community you control, the greater the power, and the greater the corruption.
Oh, look, mods clique acting as gestapo again, how unsurprising.
I got banned on one subreddit for saying "what you said is a lie, here are 2 separate sources on the truth", cited under breaking "personal attacks" rule...
For every useful post, reddit allows a ton of shitposts; humblebrags and double standards are everywhere; it's a dirty echo chamber run by former IRC ops. They also log your chats in clear text.
The author has no clue what they're talking about and it shows. Moderators are under absolutely no obligation to enforce Reddit's anti-spam policy, and I am aware of zero instances of a moderator being removed or warned for failing to remove spam.
Now moderators ARE held accountable for some things: they can't just turn a blind eye to global rules like no child pornography without being removed, and in general they're expected to keep their communities under control. If their community starts harassing other communities, either on or off of Reddit, or it brings public disrepute to reddit in some way (/r/jailbait), generally the admins will get involved. However, the admins don't care if you fail to remove spam from your own subreddit, for a very simple reason: it only affects users ON YOUR SUBREDDIT, so why would they care or be involved? The commonality between all the cases I just mentioned is that you're causing trouble OUTSIDE your community and causing Reddit as a whole problems; THAT'S when the admins get involved. They are like the federal government of reddit: they're not supposed to care about the internal affairs of states, they're supposed to care about inter-state and international affairs.
Another important thing to realise about reddit is that most of the moderation isn't done by moderators or admins; there are way too few of them to moderate effectively. The author is correct about one thing - were those Reddit's only moderation tools, scaling moderation would be impossible even with automation - but they're not the only moderation tools Reddit has. Most moderation is done through users soft-censoring posts by downvoting and reporting them to draw the mods' attention. Something the author didn't even write about, or seem to think about, is that the mod probably only looked at his post BECAUSE a user reported it.
In general, though, Reddit at the global admin level has long blocked users for self-promotion, and they even have a writeup about it. I have seen countless users banned for self-promotion, even ones generally well received by communities - even ones whose posts I kept promoting after the original poster's account got banned, because I liked their posts so much (and I know how to not get banned for spamming on Reddit) - because one man's useful post is another man's spam, and to be fair you have to just ban all self-promoters. Reddit's writeup: https://www.reddit.com/wiki/selfpromotion/
FTA: "But my article is not the type of spam these rules are meant to exclude"
FT rules: "We're not making a judgement on your quality, just your behavior on reddit." "10% or less of your posting and conversation should link to your own content"
So I would say if anything, the scalability problem Reddit has is that authors like the OP can't read the fucking rules, break rules, get banned, and cry about it while citing a misinformed view of how Reddit moderation works which isn't surprising given he didn't even bother to read the rules before publishing this crap.
P.S. The way you can self-promote while staying within the rules is to sincerely become an active member of each community, after which a small share of your posts can be your own content. This also deters cross-posting your own content across several subs at the same time, which tends to annoy people who see your content repeatedly. If you do this, you can still end up in hot water because people reflexively hate self-promotion, but you're a lot less likely to get banned. The thing is that people with no investment in the communities they're posting their content into will not do this, which is precisely the point: Reddit wants the regulars of a community to post content, not drive-by self-promoters. This overall ensures quality in a fair way. It doesn't work that well - Reddit has been astroturfed for eons - but it works somewhat.
The rules of reddit apply globally, in communities large and small. A moderator might like you and not ban you, and maybe users in a certain community like you and don't report you, but if you break the rules of reddit, it's fair for you to get banned at any time by anybody with the power to do so, citing that rule.
I'm a Reddit moderator of a medium-sized sub. We're talking more than 100k, but less than 300k.
The amount of bullshit we need to filter out is tremendous. There's constant spam every day, and it's hard to catch all of it. We allow self-promotion of original content, but only once per account and per topic; otherwise the whole sub turns into a spammy, boring mess. Automoderator helps, but Reddit does not. All of our content is in a different language, and Reddit either ignores most spam or overreacts to a report and bans a user who was using local lingo to make a joke. There is no in-between. The seniority of anyone's account doesn't mean a thing to Reddit. A simple botched GoogleTranslate-like action on a comment can get you banned fully from the platform, even as a moderator.
Do we delete discussions? Yes, yes we do, when the situation calls for it. You don't get to cause a ruckus or insult people ad hoc. You also don't get to spread rumors, obvious disinformation and pseudo-science. Save it for FB. You also don't get to threaten anyone or claim person X is of lower value due to their nationality, creed or religion. If there is too much content related to one topic, we'll delete your post and ask you to join the already-submitted one. If we sense that your tone is causing a flame war, we'll nuke the whole thread. It's called moderation, not free speech. Banning is a last resort. If you feel we've wronged you, you can always send us a polite message explaining what's up. If you feel we suck, there's always the option to create a different subreddit.
Why do I do this? I normally wouldn't, but this subreddit is very important to me because it's one of the last few places left to freely discuss the politics and daily life of a certain geographical region we live in. Homegrown forums are no longer easily visible on Google, cost money to host and have a worse interface. Facebook groups can't fill the void because people's full names are visible. So, in spite of the government crackdown on dissenting opinions, we somehow thrive under anonymity. Do people curse us? Hell yes, some do. In fact, people here on HN are somehow managing to act holier-than-thou and paint us as power-hungry basement dwellers. Fun.
I'll have you know, I would rather not deal with your shit and would rather do something else, but for now it's my moral duty to stick with what I'm doing and put in all the effort I can to ensure a safe, free zone for discussion.
But at some point, most moderators will leave. We can get banned just as easily as our users, there is little to no protection afforded to us, people hate us for deleting their content, and then they try to dox or attack us. And then you'll get what you asked for: fully automated moderation. A poorly made translation layer pushing content through an ML model trained to flag hate speech and threats, overseen by underpaid third-world workers who will eject you from the whole platform at their leisure.
It's easy to understand: it'll take hours of your day on a big sub. Many subs regularly advertise for mods, but most people aren't interested in the extra work.
I think you have problems understanding what is written. I'm not asking why they can't catch up. I don't understand why an unqualified group of people is empowered.