Hacker News

Would a social network, where every post is moderated, have the same issues?



Very good question.

Moderation is the only known effective solution. Sometimes.

FAANGs and others abandoned human moderation because it "doesn't scale".

So what? When did we decide that scale was more important than civility?

--

I also have questions.

What is the feedback loop? For every medium, every style of communication?

How do we break those loops into discrete steps? And where can we add friction, to slow down the negative psychological and sociological pathologies?

Defuse, disable those dopamine hits we all get from these social interactions. Maybe even figure out how to make it a positive feedback loop.


Google decided scale was more important than civility when they positioned text search - Pagerank - as a competitor to curation, rather than as a beneficiary of curation.

We used to have curated content indexes with human moderators guiding them - in print, Yahoo/DMOZ, AOL keywords, web rings, LiveJournal. Those required significant human labor compared to algorithmic text searching, and were not able to scale as rapidly as content creation tools did.

Was that lack of scaling a problem? No, not necessarily. Some percentage of people enjoy curation (thus Pinterest) and we could have celebrated their efforts and given them top billing in search.

Instead, Google “democratized” search by harvesting all of those human rankings and feeding them into a machine algorithm that produces seemingly better-than-human results. Unfortunately, in doing so, it did not highlight whose curation led to content being shown, and so curation became less popular over time.

Unfortunately, that curation is what led to Pagerank being so valuable. Without it, spam and liars and malicious activities have infected all “search” and “ranking” systems. Without human curation distinguishing “valuable” from “unevaluated”, search does not scale either.

We did society great harm when we sidelined curation, and no amount of machine-learning algorithms will heal that wound.


Amen.

Recommenders, like bureaucracies, are misanthropic (anti-human).

The rules are meant to remove human judgement, while denying the baked-in bias and dysfunction of the imposed ruleset.

Per Goodhart's Law, they arbitrarily state that some things are worthwhile & meaningful, and everything else is not.

They are black boxes which thwart inspection, transparency, accountability, and explanation.

--

Forgive me for flogging this horse; I do have a point.

Also missing from online social networks are the concepts of fair and impartial adjudication.

Curation, adjudication, transparency, accountability... I'm sure we're omitting many other missing features, because we're too close to the problem.

Going meta meta here: the common trait of all these "regulation arbitrage" unicorns is they profit by the destruction of our society's laws, checks & balances, social norms, and so forth.


Can adjudication be impartial with regards to civility?


I mean impartial in the sense that the adjudicators (judge, jury, arbiters, moderators, whomever) are not beholden to the publisher. So a third party. Like our court system is supposed to be.

IMHO, there's not enough daylight between ombudspersons and their paymasters.

I have no fixed opinion on standards of civility. I'm just trying to capture the notion that some dialog and rhetoric are out of bounds, as determined by the intended audience (context).


Just a technicality, but a "positive feedback loop" is probably the very thing that you might want to avoid.

https://en.wikipedia.org/wiki/Positive_feedback


Ya, suboptimal phrasing.

I mean in contrast to a negative feedback loop. Socially, negative or positive denotes impact. Virtuous vs vicious cycles.

Suggestions for better phrases?


Nope, I've been scratching my head about those for a long time now, but haven't found anything better than adding a note wherever there might be confusion... :/


What do you mean by "every post is moderated" ?


One or more human moderators are paid to review every post and comment before they're published, making an accept/reject decision against stated criteria.
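The review-before-publish flow described here can be sketched as a simple queue. This is a minimal illustration under assumptions: `ModerationQueue` and the trivial term filter are hypothetical stand-ins (a real system would put human judgement where the decision callable is), showing only the structure of pre-publication moderation.

```python
from collections import deque

def contains_banned_term(text, banned=("spam",)):
    """Placeholder criterion; a real system would rely on human judgement."""
    return any(term in text.lower() for term in banned)

class ModerationQueue:
    """Every submission waits here until a moderator accepts or rejects it;
    nothing is visible to readers until it reaches the published list."""

    def __init__(self):
        self.pending = deque()   # FIFO: review in submission order
        self.published = []
        self.rejected = []

    def submit(self, author, text):
        self.pending.append((author, text))

    def review_next(self, moderator_decision):
        """moderator_decision: callable (author, text) -> bool (accept?)."""
        author, text = self.pending.popleft()
        if moderator_decision(author, text):
            self.published.append((author, text))
        else:
            self.rejected.append((author, text))

q = ModerationQueue()
q.submit("alice", "Interesting article, thanks!")
q.submit("bob", "Buy cheap spam now")
# Here a trivial filter simulates the moderator's accept/reject decision.
q.review_next(lambda author, text: not contains_banned_term(text))
q.review_next(lambda author, text: not contains_banned_term(text))
print(len(q.published), len(q.rejected))  # → 1 1
```

The cost objection in the reply below follows directly from this structure: every item in `pending` consumes paid human review time before it can appear.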


Oh, I didn't think you were serious about that. This is probably not feasible because it would be insanely "expensive"? (I mean, Slashdot seems to have found the best middle ground here, haven't they?)



