Hacker News is a well-moderated community, but it's illustrative to see where Hacker News fails at moderation. While Hacker News is great at protecting the community from disruptive individuals, it tends to fall down when protecting unpopular individuals against the community turned mob.

I support Hacker News moderating itself however it chooses. However, if we are looking at it as a moderation model for large, open, non-editorial platforms (Youtube, Facebook) -- which I believe should all be covered under public accommodation law -- it clearly fails. And even when we are looking at ostensibly neutral, publicly oriented sites like newspaper comment boards, it fails.

Hacker News moderation is not appealable, not auditable, does not have bright line rules, and there are no due process rights. It simply does not respect individual rights.

So while this moderation method succeeds for Hacker News, and perhaps should become the model for small private sites, we should not try to scale it to internet-sized companies. Platform companies (Google, Facebook, Twitter) and backbone companies (ISPs, Cloudflare!) need a different set of rules geared towards protecting individual rights and freedoms instead of protecting a community.




I agree with everything you say. But… This site is moderated by two people and has 5 million guests. When you look at it from that perspective, I think the only conclusion you can come to is that they are doing a remarkably good job.

I say this as a person who feels that his political opinions have been treated with disrespect by the site’s community and its moderators. I fervently believe that the moderators are doing their best to be impartial. And I also see people on the opposite side of the political spectrum from me who have the exact same complaints. When I look at it that way, I realize I literally can’t ask for more from the mods.

As for the distinction you make between platform companies and backbone companies, I think you have it completely right. I detest racist shit on the Internet, but ultimately no good can come from driving it underground. I think it's much better for us to be able to see it out in the open. Plus, my political affiliation, which is the same as Abraham Lincoln's, is considered by many of the powers that be to be inherently racist.

When the platform companies start trying to decide who’s wrong and who’s right, they are forced to use extra-constitutional means. Not good.


> Hacker News moderation is not appealable, not auditable, does not have bright line rules, and there are no due process rights. It simply does not respect individual rights.

As far as I'm aware, dang and sctb are fairly reasonable if you email them, and you can see what they're doing by just clicking on their account history.


> Hacker News moderation is not appealable, not auditable, does not have bright line rules, and there are no due process rights. It simply does not respect individual rights.

I'm not sure I'd agree with this. An appeal is as easy as sending them an email. In my experience they're more than willing to hear you out.


In my experience they aren't.


What was your experience? I haven't been able to find where we interacted with you.


Of course not. I'm not insane and so I didn't continue using a shadow banned account.


You said we weren't willing to hear you out, but that seems unlikely to me. People sometimes come to HN with stories of how badly we've mistreated them, but rarely provide links or enough information to let readers make up their own minds. Mostly these stories leave out important details about how the account had behaved and how we interacted with them. But we do make mistakes—moderation is guesswork, and we guess wrong sometimes. If there's a chance of that, I'd like to know what we did so we can correct it.

We only use shadowbanning when accounts are new and show evidence of spamming or trolling, or when there's evidence that the user has been serially creating accounts to abuse HN. It's possible we got it wrong in your case, but again, we can't correct mistakes if people won't tell us about them.


> We don't shadow-ban accounts unless they're new and show evidence of spamming or trolling, or unless there's evidence that the user has been serially creating accounts to abuse HN.

That's a complete contradiction of the explanation you gave at the time. And yes, I asked what had happened when I noticed the account was shadowbanned, and your response was

----------

Hacker News <hn@ycombinator.com> Aug 31, 2018, 10:43 PM to me

Political/ideological flaming; unsubstantive comments; addressing others aggressively. Example: https://news.ycombinator.com/item?id=17358383. That's unacceptable and bannable in its own right—and you did a lot of other things along those lines.

You can turn this around by doing the opposite: (1) become less inflammatory, not more, when posting about a divisive subject; (2) make sure your comments are thoughtful; (3) be extra respectful.

Daniel

-------

(I especially liked that the example comment was from three months earlier. Why didn't I immediately think that far back?!)

Of course, despite my multiple follow-up questions, you never bothered to reply again. I'm sure that now you will find the motivation to give an extensive public explanation, complete with links, of exactly how you really meant that I was "new and spamming or trolling" or "serially creating accounts to abuse HN". It would never work to, say, reply to comments that were 'unacceptable and bannable in its own right' to say that. Or to follow your previous public explanations of moderation policy, such as

>When we’re banning an established account, though, we post a comment saying so. https://drewdevault.com/2017/09/13/Analyzing-HN.html


I've taken a look, and this was a case of what I said: we shadowban accounts when they're new and show evidence of spamming or trolling. In your case, the account was new and had repeatedly broken the site guidelines (using HN for political battle and being aggressive to other users). That's evidence of trolling, which is one of the situations where we shadowban. I mentioned https://news.ycombinator.com/item?id=17358383 by email because that post was what tipped the scales.

I can see why you are angry that we shadowbanned you, because that account went on to make other comments that were fine for HN. Indeed, quite a few were vouched for by other users. That's evidence of not trolling, and that sort of account is not the kind we shadowban—it's the kind where we post moderation replies, and tell people if we ban them. But those later comments didn't exist yet when I shadowbanned you.

Here's the information you need if you want to understand why we do things this way. HN gets tons of new accounts that break the site guidelines and in fact are created for that purpose, which is what I mean by trolling. We can't reply to them all, ask them to follow the guidelines and patiently explain where they're going wrong. If we tried, we'd do nothing else all day—or rather would go mad before getting there. Many of these users know perfectly well what the site guidelines are and have no desire to use HN as intended. If we poured moderation resources there, not only would it not work, it would make things worse, and meanwhile those resources would be unavailable for the rest of the site. For accounts like that, we use shadowbanning, and for the most part that approach works well. But it doesn't work in every case.

When a user emails us about such an account, we have to guess whether they're asking questions in good faith and really want to use the site as intended, or whether there's little hope of convincing them to do so. We don't always guess right. It looks like I guessed wrong in your case. The thing is, though, that when I sent you that detailed explanation of what was wrong with your comments and why we'd banned you, you didn't respond with any indication that you'd received the information and wanted to do something with it. Instead you responded aggressively. I get that you were angry that you had been shadowbanned and didn't know it. But that type of response is correlated with users who go on to be abusers of the site and are not people we can convince to do otherwise, no matter how many replies we give them.

All of this is pattern matching and guesswork. Your original comments and your emails matched patterns that are associated with abuse of the site. With hindsight I see that the pattern matching got it wrong, because your new account has gone on to be (mostly) an ok contributor to HN. (I say 'mostly' because, looking through its history, I still see unsubstantive comments and occasionally worse, but not bannably worse.) But I don't see what I could have done differently in any way that would scale. Our resources are meagre; we're constantly in triage. Patient explanation takes a lot of time and energy—it has taken me an hour to write this so far, and there are many more users demanding explanations than I have hours. Had your emails indicated openness to information or willingness to change, I probably would have replied further. But there are many users who fire multiple angry emails on each reply they get, and we've learned that they are not a good investment, when hundreds of other things and people are clamoring for attention and explanation.

Actually, I do think there is one thing we can do differently that is helpful in such situations: get better at handling anger. There are many users whose every interaction with us is angry and only angry. Often it feels like the intensity of their anger exceeds any of the provocations they're complaining about on HN, even if they're correct on those details. It's as if they're really angry about something else—something more important—but they turn that energy instead onto the extraneous outlet of HN and its moderators, maybe just because it's less important and so in a way safer. I find it difficult to be on the receiving end of this anger. At any moment, there are multiple people doing it. They don't know about each other, so they experience our interactions as individual and demand individual attention, while we experience it as a constant bombardment. It's possible to grow in capacity to handle this—it just requires a lot of personal work. You emailed us a year ago, and I've probably gotten better at this in the last year, so maybe the pattern-matching works a bit better now.


Oh cry me a river. "I couldn't be stuffed making more than a token effort, and here's all the excuses for why I shouldn't have to. We had no way of knowing anything about you! There were only five comments on that account, and it had existed for such a short time, that I couldn't even read the very first couple of comments that said this is the continuation of an existing account! All trolls look like that!"

And wow, give the pop psychology a rest. "I just don't understand why condescendingly shitting on people makes them angry so I've decided they are probably taking out their childhood issues on me." Yea, it couldn't possibly be actually aimed at you.


If you'd just banned me at the time I probably wouldn't be so angry still. The experience of discovering that you had been invisibly fucking with me, probably for months, was absolutely infuriating, and seeing you flat out deny that you pull that shit as a way to manage actual users is unacceptable.


I completely agree, although I like the moderation here, since it is actually very accepting; I might just have a different definition of it than the author of the article.

Another example where moderation works well is Stack Exchange. I think a Q&A site needs strict moderation to be able to stay on topic.

I think what Google is doing right now is reprehensible for example. Ghosting content because it is controversial and therefore coincidentally bad for advertising leaves a very bad taste.


> Hacker News moderation is not appealable, not auditable, does not have bright line rules, and there are no due process rights. It simply does not respect individual rights.

What you are asking for would take significantly more resources. I appreciate what you are saying, but all I ask is that a site be consistent. If it is consistently moderated, I as a user can vote with my feet (or my clicks in this case) if I approve or disapprove of the job that is being done.


Yes, I would only recommend it for the monopoly platform companies (Google/Facebook/Twitter/etc.) and backbone companies (ISPs, Cloudflare!, etc.)


Very much this. I have been rightfully corrected by dang before, and it made me reflect and adjust my general approach to commenting online. I have also been dealt with quite wrongfully a few times, though, and it made me very hesitant to comment further. I am thankful for a moderated space (there are other places for unmoderated space, but those are under attack these days) and they do a good job most of the time, which is good enough for me.

The backbone companies point is the more important one you make, imho, and is why we need a discussion about what the modern public square really is, and about how corporate dominance of media and the abuse of the third-party doctrine allow the silencing of dissent. So a private company can censor whomever they want, right? But how far does that go? First it starts with Cloudflare, and then eventually it goes down to the ISP level... and that seems like a very dangerous slippery slope to me.


>Hacker News moderation is not appealable, not auditable, does not have bright line rules, and there are no due process rights. It simply does not respect individual rights.

Yep. After eight years on the site I finally had a comment removed, and my response to dang to discuss it was completely ignored. Made me feel dumb for even trying.


Where did you request that?


In response to your removal message.


Ah, you must be talking about https://news.ycombinator.com/item?id=20258509. I don't remember that post, and it's possible I didn't see it. I try to read all the replies, but miss a few, especially when they are posted later. If you want to be sure we see something, the only way is to email hn@ycombinator.com.

It's also possible that I saw it and was too exhausted to look back through all the flagged comments in the thread, identify which ones were explaining how AMP works, and see if they had been flagged correctly. That takes a ton of energy, which is not always available when the rest of the site is clamoring for attention and in varying degrees of onfireness. One thing you can do to increase the odds of getting a specific response is to include specific links to the post(s) you're worried about—that makes it an order of magnitude easier. I certainly appreciate your intention to defend fellow users who are being mistreated.

But I think if I had seen your comment I would have replied at least to say that I believe you that your intention wasn't to downvote-bait.


Indeed that's the post I meant. Thank you for reviewing it today, and giving your thoughts.


You're welcome to email hn@ycombinator.com if you'd like to discuss it further.


"Those who do not move do not notice their chains."

In other words: People who think HN moderation is all fine and dandy only believe so because they've never had the audacity to post an unpopular fact or opinion.


> While Hacker News is great at protecting the community from disruptive individuals, it tends to fall down when protecting unpopular individuals against the community turned mob.

Isn't every community (barring a few wild wests) like that? On Usenet, someone with an unpopular opinion would end up in people's kill files. On IRC, someone with an unpopular opinion would end up on people's ignore lists, or would get kicked off the channel or k-lined off the server. Except for the kick and the k-line, the effect of a kill file or ignore is akin to a global shadow ban.

The solution to the problem you mentioned is that unpopular opinions with merit will eventually find their way to becoming popular enough that they're adopted, whereas unpopular opinions without merit end up existing in the cesspits of the Internet. Because somewhere on the Internet, any person can spout their unpopular opinion. The question is, who reads it? Is that so much different from a shadow ban?


I don't mind this sort of moderation on community sites like Hacker News.

Still, as the error keeps coming up, I will repeat myself (I apologize to people trying to read the whole thread, but this is the main point): for the big platforms (Youtube, Facebook, Twitter, Reddit, etc.) and backbone providers (ISPs, Cloudflare), we need the right to appeal, public auditing, bright line rules, and due process rights.


What about market competition? Won't that work things out for reasonable content? Won't a platform like Tor always allow all content (reasonable and unreasonable) to at least stay available, since it is censorship-resistant?


Individuals putting someone on their personal ignore list is different from a moderator deciding to put someone on everybody's ignore list though.

Your solution is a bit lacking, I believe. "If it's not popular, it has no merit, because if it had merit it would be popular". There's also no telling whether some unpopular opinion you'd consider without merit today will become popular tomorrow (and, if history teaches us anything, we've been generally bad at predicting which will become popular, which will remain popular and which will fall out of favor), so it's somewhat silly to make hard judgements.


Of course there are going to be posts which don't get upvoted enough. Everyone who frequently posts here has witnessed that some good posts don't get many upvotes, whilst others which you find less good do get them. The number of upvotes does not tell us how good your post is; all it does is determine the order in which readers see the posts, since the ordering isn't chronological. Why? Because nobody has come up with a better solution.

My reasoning (solution) is a rule of thumb, not a law. Every group of people has a blind spot. There is no perfect solution; however, I believe the solution as presented is the one with the fewest casualties. If you know a better one, I'm all ears.


What I don't like about popularity-vote moderation is that it tends to become a dictatorship of the majority: if you're outside the accepted canon, you'll get thrown out. It's effective for creating a fairly consistent echo chamber, but I don't believe it comes with few casualties. You won't see the casualties, because they are censored.

I'm fine with using a system like that for starters, but ultimately benevolent dictators should be able to overrule majority decisions. My biggest problem with your previous post was the implication that whatever isn't popular doesn't have merit. The relation between the two is weak; that's what I wanted to express.


> What I don't like about popularity-vote moderation is that it tends to become a dictatorship of the majority: if you're outside the accepted canon, you'll get thrown out. It's effective for creating a fairly consistent echo chamber, but I don't believe it comes with few casualties. You won't see the casualties, because they are censored.

First of all, I'm someone who at times reads a lot of the discussions, and I see each and every casualty by default because I browse with showdead on. I also see it as part of being a good netizen to help with moderation (and vouching or flagging is part of that), especially for non-commercial endeavors (which this place arguably is or is not, depending on your viewpoint; for me it is more akin to a .org, as it's not the purely commercial wing of YC). Heck, I even sometimes check out shadowbanned users' posts. I'm weird like that.

Censorship, in my eyes, can only be enacted by a government. There must be some less powerful word which fits the bill.

Every community [website] has its echo chamber, because every group of people has one.

Now that I've put what you wrote into what is, IMO, a more accurate context (which I felt was necessary), I'm left to ask you: what is your proposed alternative?

> I'm fine with using a system like that for starters but ultimately let benevolent dictators overrule majority decisions.

We've got two who can.

I've been on websites where less intelligent people became the benevolent dictator: people who don't see their blind spot. More moderators isn't necessarily better. Also, ask yourself: were websites with a lot of moderators, such as K5 or Slashdot or Digg or Reddit, necessarily better? Are websites with no moderation whatsoever better?

I actually have quite a bunch of "radical" viewpoints myself, and I do not feel like I cannot express myself here. Yes, at times I get upvoted or downvoted in ways that surprise me, in both directions. In the case of the latter, I always try to reflect on what I could've done better. In fact, in every conflict I have, I try to reflect on what my part in the conflict is.


> Censorship, in my eyes, can only be enacted by a government.

Why do you say that? I would think that it could be enacted by any group powerful enough to suppress speech in some way.

It was the Church that made the Index Librorum Prohibitorum. And that was in an era when it was especially weak compared to the absolute governments of the time.


Good example. I read up on it, and on reflection my view on the common definition of censorship was too narrow. Not sure about the legal one, still, but it'd differ per jurisdiction. I guess the reasoning was that, in the end, only a government can enforce it.


The problem is that dang decides for the whole community, which is predatory towards minority opinions. And we're not talking about being predatory towards Nazis; more like towards people who express positive opinions about the Linux desktop when folks generally agree it's not great (or vice versa).

Dang doesn't allow dissent in the comments; he actively tries to reduce it, and it's a net negative for this community.


" it tends to fall down when protecting unpopular individuals against the community turned mob."

I have experience with that, given that I got my early karma countering many popular people here who had downvote mobs. I had no idea I was doing that, either; I'd never seen their names before. I just saw some misinformation that I'd correct. Dan hints in the article at exactly what worked for me: just delivering the information in a logical way, with sources, that invited people, often engineers, to work out what was true for themselves. The folks with mobs used rhetoric and argument from authority. I used evidence. Eventually, most of the grey comments went dark again, with some folks in the mob going grey. They mobbed less often, too. I was also told about a pattern where some hot-heads read threads earlier, with some calmer folks (maybe older or more experienced) coming in later. That happened with most of mine, too.

So, you can't stop the fact that they'll show up. You can diminish their power by simply remaining civil and informative with evidence backing up your claims. Also, I try to use links that maintain the same qualities. If it's a political topic, it won't help to link to a site that's 100% biased in a specific direction. Biased or rhetorical sources likewise get dismissed. Fortunately, most of my arguments were technical. Many good sources for that.

I'll also add that it helps to remember that the mobs on these sites represent cliques, not people in general. Most people I've met aren't much like folks on HN, especially the aggressive ones. It helps to remind myself that what's going on in these online forums might just be representative of their culture, attachments, traditions, etc. I don't internalize it. I still introspect about it, since there are many times when I can learn something or improve myself. Doing this is a tough skill to develop. I think it's a necessity on the Internet, given how much negativity is out there, even waves of it at once. It will still tax you, but a lot less than before.

"Hacker News moderation is not appealable, not auditable, does not have bright line rules, and there are no due process rights. It simply does not respect individual rights."

We had more of that on Lobsters. That was part of the founder's experiment with the site. It was initially really different, mostly due to the vetting process we did for invites, with strict controls on quality and private messages to people. Eventually, there was a mass invite bringing in all kinds of people. Many of them aren't doing vetting so much as just telling people about a site they like. The result is that the site is now more like Hacker News than it was.

There are differences, for sure. I just think a lot of the problems are inherent to bringing in a lot of people from many different places and perspectives onto a tech site with open-ended discussions. They covered a lot of this in the original article, though, so I won't repeat it. I just think more rules, or even more accountability, won't change it.

If you want the latter, turn on showdead and you'll find that most of what they moderate away is garbage. There's some filter bubble on specific political topics that aren't popular here, but those are a tiny percent of the comments. Based on volume, I'd say moderation here is pretty light-handed in general. I mean, if you look at New repeatedly over a day, you'd question how the heck some control-freak moderators could even keep up with it at all. I stopped looking at New more than once a day since I didn't have the time for it.


Which "public accommodation law" do you refer to?


In the US, that would be Title II of the Civil Rights Act of 1964, taking the broad view of it that has developed in case law over the years. I would also support new FCC net-neutrality-style regulations, or even new acts of Congress that strengthened this.


Any changes to include internet services under the Civil Rights Act would likely run into a First Amendment challenge fairly quickly, and likely lose. Net neutrality was probably constitutional as long as it applied to physical carriers using public land (for wires) or regulated spectrum. A more likely path would be through Section 508 (the Rehabilitation Act's accessibility requirements). Even that is unlikely with the current partisan makeup of Congress and the US Supreme Court.


I agree that there would be a better chance for this line of reasoning if Manhattan Community Access had gone the other way in the latest SCOTUS term. But it was 5-4, and the decision does leave open some lines of attack.

In particular, Congressional action around this might well pass judicial review if it were to define broad due process rights around speech censorship for explicitly defined "public forums." The only thing that SCOTUS loves more than the 1st Amendment is the 14th.

No way to know until we fight it out.


> Hacker News moderation is not appealable, not auditable, does not have bright line rules, and there are no due process rights. It simply does not respect individual rights.

Exactly. HN takes the tyrannical approach to moderation: We're right and you're wrong. If you disagree, too bad.

The mob is happy to clean up any wrongthink the moderators happen to miss.

> this moderation method succeeds for Hacker News

I think that's a pretty generous statement. The quality of discussions around here has declined substantially over the past few years. In many ways it's even worse than Reddit.


You've just voiced the thing I dislike most about HN. And I do not fully agree with "Gackle and Bell, by contrast, practice a personal, focussed, and slow approach to moderation". The faceless "we" they love so much does not appear very personal to me.

Disclaimer: I have an axe to grind, having had my account, with 9k+ karma and dating back to 2009, banned for whatever reason. Despite that, I am trying my best to look at the situation objectively, and I still do not like it.

It is popular to compare reddit with HN, and my take would be like this: reddit is like a part of the Universe where stars are still being born (though it may be moving a bit past this), and there is life and dynamism. HN, otoh, seems to be inching closer and closer to the https://en.wikipedia.org/wiki/Heat_death_of_the_universe


Agreed, HN's lack of transparency is slowly killing it. You can see the discussions becoming more one-dimensional every day.


You're being downvoted because you're hitting on a trope/cliche that's called out in the rules. While I doubt HN is dying from a user-engagement perspective, I do believe the comments carry less value than they have in the past, given how negatively dang sees conflict.

Healthy conflict is good, but I've participated in healthy conflict here and was stopped by dang because of it.


> I've participated in healthy conflict here and was stopped by dang because of it.

Where did I do that?


The majority of the time you've killed my comments.


What comments? That's not something we commonly do, unless we've banned the account.


It's rarely if ever paid off to engage with you on these issues, but if you genuinely do care: I have a 10-year history in which some of my comments have been killed, and in the majority of those cases it's been because of the very existence of disagreement, not because of the nature of how the disagreement was playing out.


Any description of why comments were killed, by anyone other than the people who killed them, is imputing motives based on almost no direct evidence.


I'm curious about why you're on this 10-day-old thread.


It's impossible to say for sure without links, but those comments were probably killed either by user flags or by software, rather than by moderators. We don't typically kill comments outright unless we're banning the account.


This is entirely false, and I'm not sure why you're saying it. I literally have emails from you saying exactly the opposite.


As I said, I'm speaking generally. In the absence of links, which you've not provided, I have no idea which posts you're specifically talking about.

Edit: out of curiosity, I looked through your history. All the dead comments back to about 2014 were killed by user flags. Before that, there are a bunch of dead comments but I didn't see signs that moderators had killed them; my guess is that you were banned for a while.



