That's the biggest limitation of moderated forums: they only reflect the opinion of the most active groups, who can steer the discussion with the help of moderators who benefit from rewarding the largest groups instead of the best comments.
If moderation were visible and moderators were required to leave a note explaining why it took place, it would be a real discussion platform.
HN is not
Slashdot is a lot better than many others in this regard, but it's not popular anymore and you can't make money on it, while a lot of people live by posting shit on Reddit.
That's just the nature of discussion basically anywhere. Every person and group has lines that you're not allowed to cross, and when people cross them you just get unproductive blow-ups, and someone or some sub-group will leave.
If you tried to herd state socialists/tankies and anarcho-capitalists/voluntarists into the same discussion space, they're so violently opposed they'd just be constantly screaming epithets at each other. That's not a useful thing.
Not to mention that even when you have ideologically aligned folks, some people are just anti-social dickwads who will constantly pick fights or argue in bad faith. I don't understand some people's seeming obsession with defending this kind of person. Some people just suck, and everyone else is better off if they're not around. A private space is under no obligation to tolerate a poster who adamantly refuses to get along.
> If I'm having a discussion with people in real life I decide what I accept or not, there's no third party that decides for me what is right.
Yes, and that works fine because there's no platform there, just a one-on-one or small-group conversation. You can still easily replicate this, unmoderated, with email or various messaging apps.
Once you can talk to potentially hundreds or thousands of people at once, once there's a platform, that model breaks down. Bad actors who would be uninterested in trolling single individuals are very interested in trolling hundreds at a time. And nobody wants to "walk away" from an otherwise good community because a handful of very loud people are spouting hate there.
Any platform that's both popular and unmoderated will eventually be dominated by extreme content, and will push out normal people, who will go somewhere that's popular and moderated.
There's no intrinsic limitation on the number of participants.
Don't like what a user says? You can ignore them.
Don't like what some instance does? You can block it.
Any platform that is popular has an editorial board and doesn't want you to say things they don't like.
Simple as that.
Newspapers had no comments sections because it's pointless to comment on the news: they already decide what to publish and what not.
They already chose who to talk to; there's no point in discussing when you can only comment on what someone else wants you to talk about.
Have you seen a post on HN today about what happened exactly 40 years ago, when an Italian civilian plane, Itavia Flight 870, was shot down during a fight between NATO and Libyan fighter jets and 81 innocent people died?
You won't, because it's gonna be flagged as politics.
But you're going to read about every cat fight between über-rich Silicon Valley founders, because that's not politics for them; it's what they wanna talk about.
Trolling is a problem for the platform, not for the users.
I don't mind trolls, if I can decide who they are and silence them.
If they do it for me, it's censorship.
Censorship is not bad per se, but this isn't done in my name; it's done only in the platform's interest.
Do platforms ever ask users what they think about banning someone?
> I don't mind trolls, if I can decide who they are and silence them.
This only works for small communities. You can't feasibly block the literally thousands of trolls and petty assholes that are posting on Reddit every day without that task consuming all your time. Multiply that by every single user having to do it personally and it gets even sillier.
There's a reason basically every popular platform is moderated on some level, and it's not because of some grand meta-moderator conspiracy.
Moderation is near-universally used because it works. Not moderating doesn't work for conversations beyond a certain size. Disliking how moderators behave doesn't change that.
> Ad blocking works because I decide what to block
Ad blocking isn't a community or discussion forum, and most people just use whatever blacklists some 'authority' comes up with.
I guess the equivalent for a forum would be one where you could not only block users (which is already common) but also share and combine blocklists. That's an interesting idea.
I think you'd run into the WoW sharding problem, where it creates a sort of dissonance: you're nominally in the same space but also not in the same space at the same time. Still, it would be cool to at least experiment with.
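To make that concrete, here's a minimal sketch of what reader-side, shareable blocklists could look like; the format and the names are made up, the point is just that the filtering happens on the subscriber's machine:

    # Hypothetical sketch: each blocklist is just a set of usernames published
    # by whoever maintains it; a reader subscribes to the lists they trust and
    # filters the feed locally, on the receiving side.

    def merge_blocklists(*lists: set) -> set:
        """Union of every blocklist the reader has subscribed to."""
        merged = set()
        for blocklist in lists:
            merged |= blocklist
        return merged

    def visible(comments: list, blocked: set) -> list:
        """Keep only comments whose author isn't blocked, purely client-side."""
        return [c for c in comments if c["author"] not in blocked]

    # My own blocks plus a community-maintained list I chose to trust.
    my_blocks = {"troll42"}
    shared_list = {"spam_bot", "sealion99"}
    feed = [{"author": "alice", "text": "hi"}, {"author": "spam_bot", "text": "buy now"}]
    print(visible(feed, merge_blocklists(my_blocks, shared_list)))

Whether two readers subscribed to different lists are still "in the same thread" is exactly the sharding question above.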
It's a user-side tool to remove unwanted content based on community-generated rules.
It's content moderation nonetheless
The error, IMO, is to think that the current implementation, which is still very young and immature, is the best possible one.
It isn't
HN is not really a community, it's a platform run by a commercial entity, with (legit) interests
Imagine if HN was just a node of a larger federated network
They could decide what to post on their node(s) and which comments to remove
I could run my own instance, subscribe to their feed or to the same source feeds, and make different choices.
People could share blocklists, whitelists, favourites, ratings and everything else and decide what to use and what not
HN would still be popular, but other nodes could benefit from having more freedom or making different choices
Right now HN (and every other UGC platform out there) is an all-or-nothing experience.
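As a rough illustration of that model (everything here is hypothetical: the item format, the policy functions), two instances applying different rules to the same upstream feed:

    # Hypothetical sketch: an upstream node publishes a raw feed of items,
    # and each downstream instance applies its own moderation policy before
    # showing anything to its readers.

    from dataclasses import dataclass, field

    @dataclass
    class Item:
        author: str
        title: str
        tags: list = field(default_factory=list)

    def hn_like_policy(item):
        # One node might drop anything it considers politics...
        return "politics" not in item.tags

    def my_policy(item):
        # ...while my instance keeps it and only drops known spammers.
        return item.author not in {"spam_bot"}

    def front_page(feed, policy):
        """Same upstream feed, different front pages depending on the policy."""
        return [item for item in feed if policy(item)]

    feed = [
        Item("alice", "Itavia Flight 870, 40 years on", ["politics", "history"]),
        Item("bob", "Founder feud of the week", ["startups"]),
    ]
    print(front_page(feed, hn_like_policy))  # hides the first story
    print(front_page(feed, my_policy))       # shows both

The upstream node keeps full control of what it serves; the difference is that its choices stop being the only possible view of the content.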
Facebook is facing an ad boycott because it can't moderate the platform the way corporations want. That means advertisers are the ones who ultimately decide which content is valuable and which is not; sometimes that coincides with what users want, but more often than not it doesn't.
But if we produce the content (like this conversation we're having), we should have control over it, and be able to reproduce it on an instance we control and continue it ad libitum, even when HN decides our karma doesn't allow more than a few comments a day, or one of us is shadow-banned for reasons completely unrelated to what we're discussing right now, or it looks like spam to them, or any other reason they think calls for moderation.
That's their right, as long as the content is free for someone else to pick up; and they are not responsible for what happens on other nodes.
It should be part of giving back to the community: you generate content for us, we moderate it the way a DJ selects music for the listeners, but you can make your own playlists if you want to, because we don't make the music, we just mix it.
Nobody said HN shouldn't moderate its public instance; it has people to answer to. It simply shouldn't be the only instance.
If I had a feed of every comment and every link posted, I could read them and make my own rules
This post is being downvoted, but it's a well-known feature of HN that heated discussions are immediately flagged and disappear very quickly.
If a platform wants people to engage but doesn't want them to be passionate about their beliefs, it is not a discussion platform; it's a walled garden for a certain type of opinion.
Does it make discussions better? Probably, if you already agree with the rules or can (or want to) follow them.
What if you can't?
What if a topic is divisive because people on HN refuse to acknowledge that the prevailing view on HN is simply wrong?
Nobody will ever know.
Imagine a person going to a vegan restaurant asking for a steak. How long will it take to get kicked out?
That's a feature, if you are vegan, but it's not desirable for every restaurant, especially if they want (or like) to serve a broad range of customers.
Of course HN can say that this is exactly what they want, but then what about the discussion of whether what they want is right?
I'm talking about HN because one of the posts mentioned it as a good example of a free and open platform, but a platform that bans users for talking about politics is not really a good example of good moderation.
Moderation should happen on the receiving side; when it happens on the publisher's side, it's called editing.
Every news outlet has an editorial board, and there's nothing wrong with that, but it should be clear that the opinions expressed on an editorialised platform are not free.
Decentralisation has many downsides, but it also has the advantage of putting control in the hands of the party who receives the content, not the one who generates it.
Worse is better always wins