I don't think it's just reddit; these systems seem prone to triggering some primitive human instincts around group interaction. I can think of a few plausible explanations for the discrepancy in outcomes between HN and reddit. Perhaps the hidden scores or the per-comment floor HN uses suppress it. Perhaps HN attracts a particular sort of personality while reddit attracts a more representative slice of humanity. Maybe reddit is harder to moderate, has worse moderation tools, worse mods, or simply too many people. I'm not sure what the answer is, but one way or another I consider these sorts of systems failed experiments.
> Researchers from Hebrew University, NYU, and MIT explored herd mentality in online spaces, specifically in the context of "digitized, aggregated opinions".[4] Online comments were given an initial positive or negative vote (up or down) on an undisclosed website over five months.[5] The control group comments were left alone. The researchers found that "the first person reading the comment was 32 percent more likely to give it an up vote if it had been already given a fake positive score".[5] Over the five months, comments artificially rated positively showed a 25% higher average score than the control group, with the initial negative vote ending up with no statistical significance in comparison to the control group.[4] The researchers found that "prior ratings created significant bias in individual rating behavior, and positive and negative social influences created asymmetric herding effects".[4]
> “That is a significant change”, Dr. Aral, one of the researchers involved in the experiment, stated. “We saw how these very small signals of social influence snowballed into behaviors like herding.”[5]
https://en.wikipedia.org/wiki/Herd_mentality
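To get a feel for how a tiny initial signal can snowball, here's a minimal Monte Carlo sketch. The model and every number in it (vote counts, base probability, boost size, function names) are my own illustrative assumptions, not the researchers' model or their measured effect sizes; it only shows the general mechanism of a small seed vote compounding through social influence.

```python
# Toy herding simulation: voters are slightly more likely to upvote a comment
# whose running score is already positive. All parameters are made up for
# illustration and are not drawn from the study quoted above.
import random


def simulate_comment(initial_score: int, n_voters: int = 100,
                     p_up: float = 0.5, herd_boost: float = 0.15) -> int:
    """Return the final score of one comment after n_voters vote in sequence.

    Each voter upvotes with probability p_up, nudged upward by herd_boost
    whenever the running score is positive (a crude stand-in for social
    influence); otherwise they downvote.
    """
    score = initial_score
    for _ in range(n_voters):
        p = p_up + (herd_boost if score > 0 else 0.0)
        score += 1 if random.random() < p else -1
    return score


def mean_final_score(initial_score: int, trials: int = 2000) -> float:
    """Average final score over many simulated comments."""
    return sum(simulate_comment(initial_score) for _ in range(trials)) / trials


if __name__ == "__main__":
    random.seed(0)
    print("control (no seed vote):", mean_final_score(0))
    print("seeded with +1:        ", mean_final_score(1))
    print("seeded with -1:        ", mean_final_score(-1))
```

The asymmetry is baked in here by boosting only when the score is positive and never penalizing a negative score, so a +1 seed compounds while a -1 seed mostly just delays things. That's one possible toy mechanism consistent with the quoted finding that positive and negative influences produced asymmetric herding, not a reconstruction of it.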