
This positions Google as an intermediary.

Someone else has figured out what is misinformation and is telling Google.

The idea that _anyone_ can reliably enough decree what's misinformation is ludicrous. I acknowledge that there are vast masses of incredibly stupid people who need to pretend that Truth is handed down on clay tablets by God in order to function. We're much better off with the scientific establishment wearing this mantle than, say, religious institutions. And it's pointless to try to convince these people that science is an iterative, incremental process that's based in skepticism, not certainty.

But the minority of society that understands and participates in the process of truth-formation (including scientists!) produces a wildly disproportionate amount of epistemic value, and society depends on this process for basic functioning.

It's amazing to me that this isn't clear to everyone after the pandemic, of all things. The number of claims that were banned from social media as "misinformation" and then became expert consensus a couple of months later is mind-boggling. Following smart and quantitative people on Twitter was wayyyy more likely to give you a healthy and safe pandemic experience than following the incoherent and self-contradictory public health recommendations (let alone policy). More important than this "direct-to-consumer" ability to discuss the pandemic is that experts themselves form their opinions through this type of discussion. The notion that there's a "someone else" who has reliably figured out which dissent is out of bounds is laughable.

I'll note again that Google's current policy is limited to fairly simple things, but it's an important Schelling fence being torn down and worthy of commenting on (and pushing back against, if you believe the trend is harmful).



