Hacker News new | past | comments | ask | show | jobs | submit login

I was going to mention the same thing. This happens on Reddit too.

If your posts are unpopular for any reason, you're automatically penalized. It doesn't matter whether you're right or wrong; you're penalized for posting anything that people disagree with or don't want to hear.

That's why sites like Reddit and HN will always be echo chambers. Dissenting voices are automatically silenced. Not 100% of the time, but often enough that most people will probably stop wasting their time posting.




I find that Reddit is generally much worse about this kind of thing; perhaps it is the culture or maybe it is the fact that votes are public. If you say something people don’t like, they’ll quickly pile on you. For some reason people there really like going with the flow, and you can’t even reply to clarify without them coming after you. I have found it much less likely that this happens on Hacker News, and people are generally more willing to listen to a comment regardless of how others felt about it.


I don't think it's just reddit; these systems seem prone to triggering some primitive human instincts around group interaction. I can think of a few plausible explanations for the discrepancy in outcomes between HN and reddit. Perhaps the hidden scores or the per-comment score floor HN uses suppress it. Perhaps HN attracts a particular sort of personality while reddit attracts a more representative slice of humanity. Maybe reddit is harder to moderate, has worse moderation tools, worse mods, or just too many people. I'm not sure what the answer is, but one way or another I consider these sorts of systems to be failed experiments.

> Researchers from Hebrew University, NYU, and MIT explored herd mentality in online spaces, specifically in the context of "digitized, aggregated opinions".[4] Online comments were given an initial positive or negative vote (up or down) on an undisclosed website over five months.[5] The control group comments were left alone. The researchers found that "the first person reading the comment was 32 percent more likely to give it an up vote if it had been already given a fake positive score".[5] Over the five months, comments artificially rated positively showed a 25% higher average score than the control group, with the initial negative vote ending up with no statistical significance in comparison to the control group.[4] The researchers found that "prior ratings created significant bias in individual rating behavior, and positive and negative social influences created asymmetric herding effects".[4]

> “That is a significant change”, Dr. Aral, one of the researchers involved in the experiment, stated. “We saw how these very small signals of social influence snowballed into behaviors like herding.”[5]

https://en.wikipedia.org/wiki/Herd_mentality
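The asymmetric herding effect the study describes can be sketched as a toy simulation (all parameters here are hypothetical illustrations, not figures from the paper): a fake initial upvote makes later readers more likely to pile on, while a fake initial downvote is partly corrected by readers rescuing the comment, so its effect washes out.

```python
import random

random.seed(42)

def simulate(initial_score, n_voters=50, n_comments=2000):
    """Toy herding model: each voter sees the current score and
    up-votes with a probability nudged by the score's sign.
    The nudge is asymmetric, mimicking the study's finding."""
    total = 0
    for _ in range(n_comments):
        score = initial_score
        for _ in range(n_voters):
            p_up = 0.5          # baseline taste (hypothetical)
            if score > 0:
                p_up += 0.16    # herding: positive scores attract more up-votes
            elif score < 0:
                p_up += 0.08    # correction: readers partly rescue down-voted comments
            score += 1 if random.random() < p_up else -1
        total += score
    return total / n_comments

control = simulate(0)    # untouched comments
boosted = simulate(+1)   # fake initial up-vote
buried = simulate(-1)    # fake initial down-vote
print(f"control: {control:.1f}  boosted: {boosted:.1f}  buried: {buried:.1f}")
```

With these made-up parameters the boosted group ends well above the control while the buried group lands close to it, which is the asymmetry the researchers reported: positive manipulation snowballs, negative manipulation gets corrected.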



