Facebook decides what to show people. They could show you your friends' posts in chronological order, and/or let people have control over what they see.
But no, Facebook decides what people see. Therefore they have some responsibility for the spread of misinformation.
It doesn't get enough attention? The "algorithms" are all anyone talks about when it comes to this issue. I think people put way too much weight on them.
Once you have enough people participating in a single conversation, from all walks of life, the discourse will go to shit very quickly.
Look at Reddit as an example. Small subreddits that are focused on a specific hobby are usually pleasant and informative. Large subreddits that cater to a large audience are no better than the comment section on a political YouTube video.
And people decide to use Facebook. I am not trying to defend it, but blaming it 100% on Facebook is not fair. Even if their algorithms were perfectly tuned to amplify misinformation, there would still need to be enough people reading and sharing content for it to have an effect.
One solution could be paying for Facebook, which would change both the number of people and the incentives.
The problem is that humans are never 100% rational. If Facebook's audience were purely rational robots, then sure, you could argue that they should simply stop caring about these problematic things, the issue would go away, and so it's not Facebook's fault that they haven't done that.
But given we are talking about humans, once Facebook has spent considerable time and money studying human behavior and society in general, exactly in order to figure out how to maximize their own goals over anything else, I think they should take the blame when there are negative side effects to doing so. Saying "well if society just stopped caring about (e.g.) negative content it'd be fine, so it's society's fault" is misdirection at best and ignores both the concentrated effort Facebook has spent on achieving this outcome, as well as the hoops they've spent the past few years jumping through to defend their choices once people started calling them out on it.
This is why I suggested 'paying for Facebook'. Legislation could exist that simply says things with a commercial interest behind them cannot be given away for free.
Even a price of $0.01 would radically change the environment on Facebook.
I seriously think that selling people’s privacy is the lowest common denominator of business models. It requires no effort from the business to sell people’s data, and you can do it with almost every type of business: hotels, coffee shops, accounting firms, architecture firms, etc.
I don’t choose to use WhatsApp, but I have to because that’s what my family members use and they aren’t tech savvy enough to use anything else. So no, it’s not a simple choice. Once a product saturates the market, it gets very difficult to replace.
Facebook doesn't really decide what you see, but instead optimizes what you see to maximize your engagement. If you never engage with political content or misinformation, you generally won't see it. Once you start engaging, it will drown out everything. What they could provide is a "no politics" option, but I wonder if anyone would utilize it. There was an old saying in the personalized recommendations world along the lines of "if you want to figure out what someone wants to read, don't ask them because they will lie." For instance, if you ask people what kinds of news they want they will invariably check "science" but in fact they will mostly read celebrity gossip and sports.
Facebook decides what you see. That they have created an algorithm that "maximizes engagement" is just another way of saying that they've decided what you should see is what they believe will keep you most "engaged". They could choose to use a different metric -- it is entirely their choice.
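To make that concrete, here's a minimal, purely illustrative sketch (Python, with made-up posts and a hypothetical predicted_engagement score standing in for whatever model is actually used). The only difference between the two feeds is which metric the sort uses; nothing here is Facebook's real code.

    # Illustrative sketch only: hypothetical Post objects and scoring functions.
    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class Post:
        author: str
        text: str
        posted_at: datetime
        predicted_engagement: float  # assumed model estimate of clicks/comments/shares

    def rank_by_engagement(posts):
        # "Maximize engagement": surface whatever you're predicted to react to most.
        return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

    def rank_chronologically(posts):
        # The alternative metric: newest posts first, no prediction involved at all.
        return sorted(posts, key=lambda p: p.posted_at, reverse=True)

    now = datetime.now()
    feed = [
        Post("friend_a", "Photos from my hike", now - timedelta(hours=1), 0.2),
        Post("page_x", "Outrage-bait political post", now - timedelta(hours=6), 0.9),
        Post("friend_b", "New job announcement", now - timedelta(hours=3), 0.4),
    ]

    print([p.author for p in rank_by_engagement(feed)])    # ['page_x', 'friend_b', 'friend_a']
    print([p.author for p in rank_chronologically(feed)])  # ['friend_a', 'friend_b', 'page_x']

Same posts, same users; the company's choice of sort key is what decides whether the outrage-bait sits at the top or at the bottom.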
Facebook has experimented with a number of different options to clean up your feed, but ultimately they never get deployed because they all decrease engagement.