
Large-scale, super-sick perpetrators exist (as compared to small-scale ones who do mildly sick stuff) because Facebook is a global network, and there is a benefit to operating on such a large platform. The sicker you are, while getting away with it, the greater the reward.

Switch to a federated social system like Mastodon, with only a few thousand or ten thousand users per instance, and perpetrators will never be able to grow too large. It's easy for moderators to shut things down very quickly.




Tricky. It also gives perpetrators a lot more places to hide. I think the jury is out on whether a few centralized networks or a fediverse makes it harder for attackers to reach potential targets (or customers).


The purpose of Facebook moderators (besides legal compliance) is to protect normal people from the "sick" people. In a federated network, of course, such people will create their own instances and hide there. But then no one else is harmed by them, because all such instances will be banned quite quickly, the same way spam email hosts are blocked very quickly by everyone else.

From a normal person's perspective of not seeing bad stuff, the design of a federated network is inherently better than that of a global network.


That's the theory. I'm not sure yet that it works in practice. I've seen a lot of moderators on Mastodon complaining that keeping up with bad servers is a perpetual game of whack-a-mole, because federation is open by default. Maybe this is a Mastodon-specific issue.


That's because Mastodon and other federated social networks haven't taken off, and so not enough development has gone into them. If they take off, people will naturally develop analogs of spam blocklists, SpamAssassin, etc. for such systems, which will cut moderation time significantly. I run an org's email server and don't do much of anything besides installing such automated tools.
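The email analogy can be made concrete: Mastodon admins can already block entire domains, and the shared-blocklist automation described above is only a few lines once a community-maintained list exists. A minimal sketch (the domain names, the `BLOCKLIST` contents, and the `is_blocked` helper are all hypothetical, for illustration only):

```python
# Hypothetical sketch of shared-blocklist filtering for a federated server,
# modeled on email DNSBLs. List contents and names are illustrative only.

BLOCKLIST = {"spam.example", "abuse.example"}  # imagined shared, community-maintained list


def is_blocked(actor_uri: str) -> bool:
    """Drop federated activity whose origin domain is on the blocklist."""
    # Extract the host part of the actor URI, e.g.
    # "https://spam.example/users/bot42" -> "spam.example"
    domain = actor_uri.split("//")[-1].split("/")[0].lower()
    return domain in BLOCKLIST


print(is_blocked("https://spam.example/users/bot42"))     # blocked
print(is_blocked("https://good.example/users/alice"))     # allowed
```

The point is the economics, not the code: as with email, one shared list entry protects every subscribing instance at once, so per-admin moderation effort stays roughly constant as the network grows.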

On Mastodon, admins will just have to do the additional work of making sure new accounts aren't posting weird stuff.


Big tech vastly underspends in this area. You can find a steady stream of articles from the last 10 years where big tech companies were allowing open child prostitution, paid-for violence, and other such material on their platforms with little to no moderation.


> Switch to a federated social system like Mastodon, with only a few thousand or ten thousand users per instance, and perpetrators will never be able to grow too large.

The #2 and #3 most popular Mastodon instances allow CSAM.



