Hacker News

An echo chamber is the only real option. Computers are unaware of the concepts of good and evil, so the best a moderation algorithm can do is enforce a certain viewpoint. The question is: which viewpoint? The consensus of the users, or the viewpoint of the community leadership?



I don't think moderation should be done by algorithm. As soon as you give the task back to humans, you become much more capable of shades of gray and thoughtful, real moderation. Humans have been moderating public spaces for thousands of years; we're more than up to the task if a little care is put into the implementation.


But humans are very, very slow at this. Even our world-class moderation systems (the legal systems of many countries) are excruciatingly slow, often taking months or years to reach a single decision.



