
[flagged]



I think this is exactly what they’re talking about. In no way did the grandparent endorse or glorify Wallace; his implication was that Wallace would have been less popular had he, and therefore his opponents, been heard. But in this environment, merely suggesting a person should be heard is taken as a strident endorsement.

Another more extreme example would be the Nazis. They went through early political suppression and existed in an environment where organized political street violence was already common. Who knows what the counterfactual was, but censoring and punching Nazis did not stop their rise. The Coming of the Third Reich by Richard Evans is a good book on this. It certainly seems that extremism is fostered by such environments.


> censoring and punching Nazis did not stop their rise.

Precisely. Nadine Strossen, president of the ACLU from 1991 to 2008, argues that censorship helped the Nazis [1]:

> "In the Weimar Republic there were laws very similar to the anti hate speech laws that still exist in Germany today. And they were very strictly enforced, there was an umbrella of Jewish organizations in the Weimar Republic, the head of which did a study. They said that these laws are by and large being strictly enforced, the prosecutions are being capably handled, there were many convictions, including of Nazis, and the Nazis loved the propaganda. They got far more attention than they otherwise would have, became free speech martyrs, actually had posters saying, 'In all of Germany why is this one man silenced?' They gained sympathy and attention that they otherwise never would have."

[1] https://youtu.be/J1iZffRFs8s?t=2838


A written version of their (by which I mean Strossen and others involved in the FIRE movement, like Haidt) "anti-hate laws helped the Nazis" argument is found here:

https://www.thefire.org/would-censorship-have-stopped-the-ri...

> Considering the Nazi movement’s core ideology, as espoused by Hitler in “Mein Kampf,” rested on an alleged conspiracy between Jews and their sympathizers in government to politically disempower Aryan Germans, it is not surprising that the Nazis were able to spin government censorship into propaganda victories and seeming confirmation of their claims that they were speaking truth to power, and that power was aligned against them.

I don't find it particularly convincing, but it does show that complaining about free speech and censorship by "the Jews" is a strategy with a long history of success among Nazis. Everyone else thinks, "oh, these people are Nazis." But Nazis think, "it's a Jewish conspiracy to silence the truth," because they are Nazis.


> I don't find it particularly convincing

Can you elaborate on what you find unconvincing about appeals to open discourse? Are you saying you think censorship is more effective?


The Nazis were helped by a lot of things.

Suggesting that anti-hate speech laws are worth bringing up in this context is like mentioning that Hitler was a vegetarian or banned smoking, and the claim seems to be sourced to a cartoonist rather than a historian.

Hitler went to jail for 'high treason' after an attempted coup.

> Goebbels' tactic of using provocation to bring attention to the Nazi Party, along with violence at the public party meetings and demonstrations, led the Berlin police to ban the Nazi Party from the city on 5 May 1927.[65][66] Violent incidents continued, including young Nazis randomly attacking Jews in the streets.[62] Goebbels was subjected to a public speaking ban until the end of October.[67]

Goebbels, the victim of oppressive laws that stop you from randomly attacking people in the street. Why have we not learned this lesson from history? If you stop them from violently attacking people in the streets and from removing the government, then it's your own fault what happens next.


I mentioned Evans’ The Coming of the Third Reich because knowing the historical context is critical for understanding what happened here. Those measures were not targeted at the Nazis specifically, nor were the Nazis’ tactics developed in a vacuum. Organized political street violence was already normal in Weimar Germany before the Nazis had more than a dozen members. They came into existence at a time when violence was a socially acceptable method of shutting people up. They faced political and legal repression, censorship, and arrests before the coup.

I don’t know if it helped them amass the numbers and backing and cement the ideology that led to the attempted coup, but it definitely didn’t stop it. You can certainly see how pre-existing political street gangs made it easier for them to justify forming their own.


The point is, countering negative ideas with suppression does not work. Countering speech with more speech is better. Roger Baldwin, founder of the now gone-astray ACLU, may have put it best [1]:

> Host: "What possible reason is there for giving civil liberties to people who will use those civil liberties in order to destroy the civil liberties of all the rest?"

> Roger: "That's a classic argument you know, that's what they said about the nazis and the communists, that if they got into power they'd suppress all the rest of us. Therefore, we'd suppress them first. We're going to use their methods before they can use it."

> "Well that is contrary to our experience. In a democratic society, if you let them all talk, even those who would deny civil liberties and would overthrow the government, that's the best way to prevent them from doing it."

Regarding your claim that this is only "faintly" related to free speech, you should know that Jonathan Haidt is very close to that issue. He co-authored a book [2] with the current president of the Foundation for Individual Rights and Expression (FIRE), Greg Lukianoff. See also: Jonathan Haidt on the moral roots of liberals and conservatives [3].

[1] https://youtu.be/ND_uY_KXGgY?t=1225

[2] https://www.amazon.com/Coddling-American-Mind-Intentions-Gen...

[3] https://vimeo.com/27861464


> The point is, countering negative ideas with suppression does not work.

This is the core of your argument, and if true, it’s unassailable. However, I don’t believe it’s true. Do you have evidence that it is? I don’t mean well-crafted arguments by respected people, but actual evidence.

For my part, I’ve seen evidence to the contrary. Elsewhere in this thread, there’s a study showing positive effects of deplatforming on Reddit. I’ve also observed that online forums invariably turn into cesspits if they aren’t moderated. The larger the forum, the more aggressive that moderation has to be, to the point of banning and shadowbanning. HN does it. (For that matter, downvoting is another form of deplatforming, in that it literally pushes other people’s opinions out of sight.)

So there are two points of evidence that make me believe deplatforming is effective: one is academic, and one is the personal observation, which I think we’ve all shared, that moderated fora work better than unmoderated fora. What evidence do you have? Please summarize rather than just posting links.


> This is the core of your argument, and if true, it’s unassailable. However, I don’t believe it’s true. Do you have evidence that it is? I don’t mean well-crafted arguments by respected people, but actual evidence.

> For my part, I’ve seen evidence to the contrary. Elsewhere in this thread, there’s a study that cites the positive benefits of deplatforming on Reddit.

I'm familiar with that study, and I'm pretty sure the authors acknowledged that they only measured the activity of individuals who remained on Reddit. It's really hard to say definitively what happens when groups move to private forums where you can no longer track their conversations. Also, what people say in public vs. in private can differ [1] [2], and I think periods when that is true (such as right now) should prompt us to consider whether censorship contributes to it. I'd argue that state or widespread cultural censorship does contribute to self-censorship by chilling conversations, and that people's views don't change when you block them from view. It's like sending someone to prison vs. rehabilitation. You might still argue that the views can't spread as easily, and I would argue against that too.

When groups are deplatformed, they find their own ways to communicate, either by joining private groups on services like Telegram or by creating their own platforms.

Then they are outside your sphere of influence and harder to reach. We are pushing them in this direction and I think we will end up regarding that as a mistake. But it's not like this is the first time that's happened.

You can join my talk next Wednesday, October 12th [3] to learn more about what I think with sources. It's the first one listed. This particular question is not the focus, but it is related. I'd also recommend listening to free speech defenders such as Ira Glasser, Nadine Strossen, Greg Lukianoff, and Jonathan Rauch, to name a few [4]. They've all spoken at length about this via many forms of media, books, podcasts, conferences, etc. Anyone interviewed on the So to Speak podcast is also great, particularly the earlier episodes where they bring in the older generations who've been defending free speech for their entire lives.

The evidence is there, but every time there is a new technology it takes time to collect. You can instead consider a principled approach based on what you know works between individuals. Jonathan Rauch makes a great case for this in his books and speaking.

> I’ve also observed that online forums invariably turn into cesspits if they aren’t moderated.

I think this is true! Yet, the more we censor, the more concentrated the echo chambers become, both the ones you agree with and the ones with whom you disagree. So IMO we must stop regarding "ugly" forums as a bad thing to be avoided. We all have difficult conversations in the real world, and I think curating the online world to look nice and pretty only provides more evidence to the idea that our current "solution" to online disagreements isn't working and appears to be infecting the real world.

> The larger the forum, the more aggressive that moderation has to be, to the point of banning and shadowbanning. HN does it.

That is evidence that it's popular, not that it is a good idea.

> (For that matter, downvoting is another form of deplatforming, in that it literally pushes other people’s opinions out of sight.)

I agree, and I think that has the unfortunate side effect of hiding some useful rebuttals, as I mentioned elsewhere in this thread [5].
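To make that side effect concrete, here is a toy sketch of score-based ranking hiding content. Everything here is an assumption of mine for illustration (HN's actual ranking and hide threshold are not public): comments are sorted by score, and anything at or below a threshold simply drops out of the visible list, useful rebuttals included.

```python
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    score: int  # upvotes minus downvotes

def visible_comments(comments, hide_below=-4):
    """Toy model of vote-based moderation: rank by score, then drop
    anything at or below the hide threshold, the way heavily
    downvoted comments fall out of sight. The threshold value is an
    assumption, not any platform's real cutoff."""
    ranked = sorted(comments, key=lambda c: c.score, reverse=True)
    return [c for c in ranked if c.score > hide_below]

thread = [
    Comment("useful rebuttal", -5),  # hidden despite being useful
    Comment("popular take", 42),
    Comment("mild dissent", -2),     # low-ranked but still visible
]
print([c.text for c in visible_comments(thread)])
# → ['popular take', 'mild dissent']
```

Note that the "useful rebuttal" disappears entirely: the mechanism cannot distinguish an unpopular-but-useful comment from a genuinely bad one.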

> So there’s two points of evidence that make me believe that deplatforming is effective: one is academic; and one is the personal observation, that I think we’ve all shared, that moderated fora work better than unmoderated fora. What evidence do you have? Please summarize rather than just posting links.

I'd say the jury's out on how to effectively moderate fora. We're just at the start of this era and it's been quite a bumpy ride. Moderation, at the very least, should be transparent to the authors of the content, and that's not the case across all of the major platforms, including this one.

[1] https://youtu.be/-ByRjHwknbc?t=2892

[2] https://www.axios.com/2022/08/17/americans-voters-private-be...

[3] https://truthandtrustonline.com/pre-conference-workshops/

[4] https://www.reddit.com/r/reveddit/comments/wxfjvy/im_on_a_po...

[5] https://news.ycombinator.com/item?id=33055630


Thanks for your thoughtful reply. You didn't actually answer my request for evidence, though.

I think this is the study we've been discussing: https://seclab.bu.edu/people/gianluca/papers/deplatforming-w...

This is my understanding of their analysis, based on a fairly shallow read:

> Are accounts being created on an alternative platform after being suspended?

A: Yes, 59% of Twitter users and 76% of Reddit users moved to Gab.

> Do suspended users become more toxic if they move to another platform?

A: Reddit users became more toxic on Gab. 60% of Twitter users became less toxic and 20% became much more toxic, although the most toxic posts contained hatred against Twitter and complaints that their free speech and rights had been denied. (Toxicity was determined using Google's Perspective API.)

> Do suspended users become more active if they move to another platform?

A: Yes. Manual inspection determined that at least some of that increased activity consists of complaints about being suspended.

> Do suspended users gain more followers on the other platform?

A: Although users tend to become more toxic and more active after they move to the alternative platform, their audience decreases.
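As an aside on methodology: the "toxicity" numbers above come from Google's Perspective API, which is a REST endpoint that returns a 0–1 probability per requested attribute. A minimal sketch of how such a query is assembled and its response read (the endpoint and field names follow Google's public documentation; `API_KEY` is a placeholder, and the network call itself is left as a comment):

```python
import json

ANALYZE_URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/"
    "comments:analyze?key=API_KEY"  # API_KEY is a placeholder
)

def build_analyze_request(text: str) -> dict:
    """Build the JSON body for a Perspective API toxicity query."""
    return {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }

def extract_toxicity(response: dict) -> float:
    """Pull the summary toxicity probability (0..1) from a response."""
    return response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

# The live call would look roughly like:
#   r = requests.post(ANALYZE_URL, json=build_analyze_request("some comment"))
#   score = extract_toxicity(r.json())

# Offline demonstration with a response shaped like the API's:
sample_response = {
    "attributeScores": {
        "TOXICITY": {"summaryScore": {"value": 0.87, "type": "PROBABILITY"}}
    }
}
print(json.dumps(build_analyze_request("some comment")["requestedAttributes"]))
print(extract_toxicity(sample_response))
```

The relevant caveat for the study is that this score is a model's probability estimate, not ground truth, which feeds directly into the subjectivity objection raised elsewhere in the thread.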

I think you could read this either way. Deplatforming is ineffective because it "radicalizes" those who have been deplatformed; or, deplatforming is effective because it reduces the spread of toxicity. Your post above mainly focuses on the former; my post focused mainly on the latter.

The jury's still out, as you said. Personally, I'll continue to lean in favor of moderation, if only for the selfish reason that unmoderated communities are nasty places, and I want to participate in communities that "bring me joy," to indulge in a Kondo-ism. I think we've shown pretty conclusively, though, that your argument "The point is, countering negative ideas with suppression does not work" is premature at best.

I'll let you have the last word. Best wishes.


> You didn't actually answer my request for evidence, though

I agree with the NYT that censorship is rooted in fear [1].

Evidence of the benefits of open discourse is plentiful. In real-world places where open discourse is encouraged, people and ideas thrive. Also, "you need to be protected from other people's words" is not a winning argument in the public sphere. People want to be trusted to make their own decisions about how to feel, not have the importance of speech dictated to them.

It's really only a small minority who seek protection against certain viewpoints, and they too want to be able to express themselves. Unfortunately, censorship is also used against them, often with prejudice and without their knowledge [2]. History has shown how this has happened over and over, for example in "Don't Be a Sucker" (1947) [3]. If you choose to ignore it, that is your prerogative. History is evidence.

> I think is the study we've been discussing

Thanks for linking it; that's not the one I had in mind. To expand on my previous comment about why we should accept "ugly" forums, I think measuring toxicity is problematic. For one thing, it's a subjective measure: one man's trash is another's treasure. For example, here's an article from someone making a case in favor of Kiwi Farms [4]. But also, censorship can chill what people state publicly. I already shared Axios's write-up of a recent study showing that, these days, what people say publicly does not align with what they say privately [5].

> The jury's still out, as you said. Personally, I'll continue to lean in favor of moderation, if only for the selfish reason that unmoderated communities are nasty places, and I want to participate in communities that "bring me joy," to indulge in a Kondo-ism. I think we've shown pretty conclusively, though, that your argument "The point is, countering negative ideas with suppression does not work" is premature at best.

FWIW, I think moderation is fine if the author is informed of actions taken against their content. That is not happening consistently on any of the platforms though, and hundreds of millions, perhaps billions, of users are impacted. Load 10 tabs of randomly selected active Reddit users on Reveddit [6]. Five or more will have had a comment removed within their first page of recent history. Almost none of these will have been notified, and all of their removed comments are shown to them as if they're not removed. I just did it and got 7. Reddit last reported 450 million monthly active users. And, Facebook moderators have a "Hide comment" button that does the same thing:

> "Hiding the Facebook comment will keep it hidden from everyone except that person and their friends. They won’t know that the comment is hidden, so you can avoid potential fallout." [7]

It's hard for me to believe that this has had no negative impact on discourse, particularly when our recent difficulties communicating across ideologies align so well with the introduction of social media. Things like this 1998 Firing Line episode [8] simply do not happen today. Conversations these days are shallow and combative.

> I'll let you have the last word. Best wishes.

I will reject (graciously, I hope) your offer. I think continued discussion is the way forward.

[1] https://www.nytimes.com/2022/09/10/opinion/schools-banned-bo...

[2] https://www.reveddit.com/about/faq/#need

[3] https://www.youtube.com/watch?v=23X14HS4gLk

[4] https://corinnacohn.substack.com/p/the-world-should-not-need...

[5] https://www.axios.com/2022/08/17/americans-voters-private-be...

[6] https://www.reveddit.com/random

[7] https://www.agorapulse.com/blog/hide-comments-on-facebook/#o...

[8] https://www.youtube.com/watch?v=2KQmPeM0Gmo


[flagged]


> How has Chomsky's strategy worked?

It's worked incredibly well. The United States, a country with some of the strongest free speech protections in the world, continues to be one of the most diverse and welcoming countries anywhere. It's one of the most desired destinations for immigrants and continues to have a large foreign-born population. By comparison, look at how a single-digit-percentage influx of foreigners rattled Europe. When racists do gather, it usually results in plummeting support for their causes. This was the case with the Unite the Right rally: following the rally, support for the movement dropped considerably.

By what metric has Chomsky's strategy failed? You seem to be postulating this as fact, without anything backing it up.


"By comparison, look at how a single digit percentage influx of foreigners rattled Europe."

Islam played a role in that. Europe and Islam have been on mostly hostile terms since around 700 AD. There is a much smaller cultural difference between the mostly Hispanic immigration into the US and mainstream American culture than there is between the mostly secular European cultures and people from, say, Afghanistan, who express significant support for things such as Shari'a law.


The thesis is that rational discourse would triumph over lunacy.

Did it?


It's amusing that Chomsky at least says one sensible thing, and that's what causes the far left to reject him.


Chomsky is the far left, at least in America. Of course the neoliberals (center-right) rejected him.


Seems like it's worked well? Wikipedia tells me that he's "one of the most cited scholars alive" and "has influenced a broad array of academic fields"; I'm not personally familiar with his academic work, but most of my friends strongly endorse his perspectives on capitalism and foreign policy. It's hard to see how compromising on free speech could have led his ideas to be more successful.


> That's a classic argument you know, that's what they said about the nazis and the communists, that if they got into power they'd suppress all the rest of us.

With the small difference being that those groups did not suppress speech by refusing to invite speakers to an event. Let's not put academia memberships and pogroms in the same category.


I wasn't comparing academia to Nazis. Rather, I was saying that even in the most extreme situations, e.g. with Nazis and communists, open and civil discourse is still better than censorship.

It's also worth noting that FIRE was expressly founded to deal with free speech violations in higher education. FIRE's president often discusses this, for example in a recent interview with Nick Gillespie [1]. FIRE recently expanded to cover all free speech issues, but higher education is where they got started. They've taken over a role that the ACLU has largely abandoned, since the ACLU now construes rights as being in conflict with each other, as former Executive Director Ira Glasser mentions [2] regarding the ACLU's new guidelines [3]:

> "The guidelines are designed to assist in consideration of the competing interests that may arise when such conflicts emerge. The guidelines do not seek to resolve the conflicts, because resolution will virtually always turn on factors specific to each case."

> "The potential conflict between advocacy for free speech and for equal justice in the fight against white supremacy is especially salient, but by no means unique in presenting tensions between ACLU values."

But you're not supposed to construe rights as being in conflict with each other, as the Ninth Amendment implies [4]:

> The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people

As with balancing form and function, one should not take away from the other.

[1] https://podcasts.apple.com/us/podcast/greg-lukianoff-saving-...

[2] https://youtu.be/x0Lc5b8Flto?t=87

[3] https://www.aclu.org/sites/default/files/field_document/aclu...

[4] https://en.wikipedia.org/wiki/Ninth_Amendment_to_the_United_...


He may be right about the general idea, but the point was that his phrasing needs some serious work. The classic argument is really that they should be censored so they can't easily organise and kill us. If he doesn't say that, I don't know if he realises what stakes he's talking about. (He probably does, but... say it.)

I'm annoyed, because there are lots of people who don't realise this is literally about survival for others, not just about free speech ideas.


> some serious work. The classic argument is really that they should be censored so they can't easily organise and kill us.

There's a difference between words and actions. Words do not kill. No, words are not violence, no matter how much people try to claim otherwise.

Furthermore, censoring people does nothing to reduce a group's ability to kill. Taking away someone's public voice does not take away their capacity to carry out violence. Disinviting a speaker or banning a book does not magically make people's guns disappear.


It was a classic argument made during his lifetime, from the 1910s until 1981, when that video was made.


[flagged]


So, I like the voting system on social media because it surfaces good comments.

But I don't like it when the vote rankings stifle good retorts, which is what has happened here: logged-out users won't see this entire thread of replies because the comment to which you are replying is marked "dead".

What's the solution?


If you have sufficient HN karma, you can click the time stamp and then the “vouch” link. It will bring the comment back from the dead.


Thanks, I did so in this case, though in general I'm wary of that button. The system could count it against me if I "vouch" for something that people later downvote. We don't know for sure what happens because they don't tell us how it works.



