Social media platforms are designed to show you content that is likely to get you to engage with the site by sharing it, liking it, or commenting on it.
This creates an incentive to write provocative, controversial content: it's one of the easiest ways to get attention, and the platform assumes that if you're getting reactions from people, your content deserves to be seen by more eyeballs.
I've thought a lot about this, and I think there are many changes, mostly design changes, that would decrease toxicity on social media platforms. I also think most platforms wouldn't implement these ideas, because they would decrease engagement and ad revenue.
Here are some of my crazy ideas (in paragraphs because I had trouble formatting bullets):
Allow negative reactions from normal users to make content less visible to others. The advantage of a downvote system is that it takes the burden of content moderation off of moderators and shifts it onto the people who actually consume the content. The disadvantage is that people who get downvoted often don't learn anything from it, because they don't know why they were downvoted and have to guess. Maybe when you downvote someone, you should have to pick a reason, and a breakdown of the downvote reasons should be visible to everyone.
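Here is a minimal sketch of what "downvote with a required reason" could look like; the reason codes, class, and visibility threshold are all made up for illustration, not taken from any real platform.

```python
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical set of reasons a voter must pick from.
DOWNVOTE_REASONS = {"off_topic", "misleading", "hostile_tone", "spam"}

@dataclass
class Post:
    author: str
    text: str
    downvotes: Counter = field(default_factory=Counter)

    def downvote(self, reason: str) -> None:
        # Require a reason so the author gets feedback, not just a falling score.
        if reason not in DOWNVOTE_REASONS:
            raise ValueError(f"unknown downvote reason: {reason}")
        self.downvotes[reason] += 1

    def reason_breakdown(self) -> dict[str, float]:
        # Public breakdown anyone can inspect, so the author doesn't have to guess.
        total = sum(self.downvotes.values())
        return {r: n / total for r, n in self.downvotes.items()} if total else {}

    def is_visible(self, threshold: int = 10) -> bool:
        # Enough downvotes reduces visibility without a moderator having to step in.
        return sum(self.downvotes.values()) < threshold
```

The point of the sketch is that the public breakdown, not the score itself, is what actually teaches anyone anything.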
Content recommendation improvements. YouTube's algorithm is infamous for turning moderates into Neo-Nazis. Part of the problem is that the algorithm that picks your next video optimizes for whatever keeps people watching YouTube. It makes no effort to separate news and facts from opinion and infotainment, so the lines get extremely blurry. If YouTube had separate communities for news on different topics, and others for opinion and infotainment, it could shield people who really just want the news from being exposed to three-hour-long alt-right rants they weren't looking for in the first place.
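To make the "separate lanes" idea concrete, here's a toy sketch of a recommender that only ranks candidates the viewer has opted into; the content-type labels and watch-time score are hypothetical stand-ins, not how YouTube actually works.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Video:
    title: str
    content_type: str          # e.g. "news", "opinion", "infotainment" (illustrative labels)
    predicted_watch_time: float

def recommend_next(candidates: list[Video], allowed_types: set[str]) -> Optional[Video]:
    # Restrict the candidate pool to the lanes the viewer asked for,
    # then rank by predicted watch time as usual.
    eligible = [v for v in candidates if v.content_type in allowed_types]
    return max(eligible, key=lambda v: v.predicted_watch_time, default=None)

# A viewer who only wants news never sees the "opinion" video,
# no matter how engaging it is predicted to be.
next_video = recommend_next(
    [Video("Local election results", "news", 4.0),
     Video("Three-hour rant about the election", "opinion", 55.0)],
    allowed_types={"news"},
)
```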
We need more awareness of emotional manipulation in social media and news content. For example, if a headline contains the word "disturbing," I would call that emotional manipulation: the headline is trying to do your thinking for you, reach conclusions for you, and tell you what to think and feel before you've had a chance to read the article and digest it. Too often, we vaguely point to "education" as the way to keep people from sharing toxic content or disinformation online. The fact is that educated people are human too, just as emotional as anyone else, and they are not going to be in a fact-checking mindset when a headline or image makes them feel sad, angry, or disgusted. We need platform designs that let ordinary people flag, downvote, or otherwise participate in the fight against emotionally manipulative content.
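As a deliberately crude illustration of the kind of signal that could feed such a flagging system, here's a check for emotionally loaded words in a headline; the word list is made up, and a real design would lean on user flags rather than a word list alone.

```python
# Hypothetical list of emotionally loaded words; a real system would be far richer
# and would weigh user flags, not just vocabulary.
LOADED_WORDS = {"disturbing", "shocking", "outrageous", "terrifying", "infuriating"}

def loaded_word_hits(headline: str) -> list[str]:
    words = {w.strip('.,!?:;"\'').lower() for w in headline.split()}
    return sorted(words & LOADED_WORDS)

headline = "Disturbing footage shows what really happened"
hits = loaded_word_hits(headline)
if hits:
    print(f"Surface to reviewers: emotionally loaded words {hits}")
```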
We need to improve the moderation process so that people trust moderators more, understand their decisions better, and can contest those decisions if necessary. When a moderator removes a post, it shouldn't just disappear; it should be replaced with a list of the rules it violated, visible to everyone. Moderators should also be able to hide a post temporarily to give the author a chance to edit and resubmit it. And the first time someone posts in a community, they should be shown the sitewide rules and community rules right before they submit, giving them a chance to go back and revise. (This mainly applies to a place like Reddit, but I think a platform like Facebook would also be better off with community rules.) These changes would make moderation more transparent and less surprising, and would leave people with a more positive experience of moderators overall.
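Here's a rough sketch of the post life cycle described above, where removal leaves a public record of the violated rules and a temporary hide lets the author edit and resubmit; the state names and fields are illustrative, not any platform's actual model.

```python
from dataclasses import dataclass, field
from enum import Enum

class PostState(Enum):
    VISIBLE = "visible"
    HIDDEN_FOR_EDIT = "hidden_for_edit"   # author may revise and resubmit
    REMOVED = "removed"                   # replaced by the rules it violated

@dataclass
class ModeratedPost:
    text: str
    state: PostState = PostState.VISIBLE
    violated_rules: list[str] = field(default_factory=list)

    def remove(self, rules: list[str]) -> None:
        # Removal keeps a public record instead of making the post vanish.
        self.state = PostState.REMOVED
        self.violated_rules = rules

    def hide_for_edit(self, rules: list[str]) -> None:
        # Give the author a chance to fix the post rather than losing it outright.
        self.state = PostState.HIDDEN_FOR_EDIT
        self.violated_rules = rules

    def resubmit(self, new_text: str) -> None:
        if self.state is PostState.HIDDEN_FOR_EDIT:
            self.text = new_text
            self.state = PostState.VISIBLE
            self.violated_rules = []

    def render(self) -> str:
        # What other users see in each state.
        if self.state is PostState.REMOVED:
            return "Removed for violating: " + ", ".join(self.violated_rules)
        if self.state is PostState.HIDDEN_FOR_EDIT:
            return "Temporarily hidden; the author has been asked to revise it."
        return self.text
```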
On Facebook, Twitter, and YouTube, I believe part of the problem is that the platform encourages you to follow people. When you follow personalities instead of topics, it introduces a lot of potential toxicity and content moderation problems. There's no benchmark for civility, nothing you're encouraged to talk about, just opportunities and rewards for creating drama. If these platforms made it harder to orient everything around personal brands, and easier to engage in communities that at least have a stated purpose, that would probably help discourage toxicity. In a community with a stated purpose, it's easier for a member to say, "That type of content doesn't belong here; this community is supposed to be about x," where x is a fairly uncontroversial topic.