Not OP, but the theory was interesting to me. Allowing the engagement verbs on content like this is how Facebook measures the quality of a piece of content (quality as in: will this make people spend more time on Facebook?). No verbs, no metadata for FB's algo to work with, no way to tell one piece of content from the next.
First, Facebook users are mistaking engagement counts and Facebook's profit motive for a trust signal. A Nazi comparison post is spliced in between photos of your cousin's kids and a weekend trip your college roommate went on. If this same procession were flashing past your eyes instead of being scrolled, it would look like a scene from A Clockwork Orange.
Second, say you already believe something false (false because vaccine mandates are not actually like Nazi Germany). Facebook is optimized to find posts from people who believe the same false thing and show them to you in the pursuit of time-on-site; Facebook is doing matchmaking to reinforce false beliefs. That serves the same end as misinformation. So even if these posts aren't misinformation themselves, they share its harmful outcome.
It's interesting that Facebook openly labels this stuff misinformation, because there is some truth to your claim. These were just garbage opinions until they came in contact with Facebook's business model; the chemical reaction turned them into something worse. Facebook itself calls it misinformation, but it could only have become that with Facebook's help. A nice own-goal scored by the company.