Hacker News
Facebook thrives on criticism of “disinformation” (doctorow.medium.com)
198 points by throwawaysea on Oct 1, 2021 | 119 comments



I think Doctorow is largely right that this uncritical acceptance of tech "magic powers" actually strengthens them; the 'rather evil than incompetent' framing works in their favor. As he alludes to in the piece, Karp and Thiel used to actively push PR that presented Palantir as a sort of omnipotent surveillance super-tool, because it actually made the company seem more intimidating.

Criticism of 'precrime' tech or robo-judges should be directed at how goddamn stupid it is to point machine learning algorithms that were made for perception at tasks that involve complex cognitive and ethical human judgements.

Cambridge Analytica was a good example of this as well. The press fell over themselves to characterise their technology as some sort of election tipping, mind controlling super tool. They probably loved it. In reality there's not even any scientific backing for their targeting and it most likely did barely anything, they just wanted to look like James Bond's Spectre.

edit: another thing the post made me think of is how much the tech sector loves Yuval Noah Harari. He once said he's surprised by it but I'm absolutely not, because he basically constantly tells them how omnipotent and all-powerful they are.

There's actually a fantastic scene about this in The Young Pope (a satirical show about the internal workings of the Catholic Church):

> Do you know how many books have been written about me?

> Seventeen.

> Eighteen. The last one's going to press next week, and it's got the best title of all.

> Which is?

> The Man Behind the Scenes. I suggested it myself.

> Who wrote it?

> Manna, that leftist reporter.

> That means it's going to be critical of you, Your Eminence.

> Of course. Those are the best. They turn you into a legend.


Dude you misunderstand.

So does Mr. Doctorow.

Everyone who has shelled out cash for marketing and advertising knows it's crap. People who work in adtech know it's crap. Politicians know. CEOs know. They pay the bills.

BUT it's better than the previous crap and all other alternatives, thanks to more data. Which is why anyone who wants to influence, persuade or control others sits 24/7 on social media.

When it comes to the Game of Persuasion, if everyone is using crap tools, whoever uses them more has an advantage. Momentarily at least, until someone else has more energy or outspends you.

Attacking the crappiness of the tool won't reduce disinformation.

Billions have been spent on reducing the crappiness, and it's unsolvable, because you never reduce the number of mindlessly ambitious chimps who will use whatever crap tools exist to gain influence, persuade and control others.


Right, the point is you have to be in the game. If you were to pull out because the game is crap and cede it to your opposition, it wouldn't be crap anymore. They'd dominate the channel and own the signal. Better to pay your money, stay in the game and spoil it for everybody than let anyone else win.

Also, Cory correctly identifies, but dismisses, the real power of FB and similar tech, which is connecting people with stuff they're interested in.

"Seen in that light, “online radicalization” stops looking like the result of mind control, instead showing itself to be a kind of homecoming — finding the people who share your interests, a common online experience we can all relate to."

We know this. Algorithmically tuning your feed into torrents of stuff you already like is the issue. He is right that we can all relate to this. The reason we use search is to find stuff we know we want more of, but being dumb chimps, the uncomfortable fact is we don't always know what's good for us. Hence OxyContin: it feels great until you depend on it to feel anything at all, and then you're dead. Or slagging off humans with vaginas for being TERFs. Or storming the Capitol building chanting "shoot him with his own gun".

The problem then is, who gets to decide what's good for us? Maybe it's our right to die or become brainwashed however we choose. I don't know, I've got no easy answers for you here.


> Cambridge Analytica was a good example of this as well. The press fell over themselves to characterise their technology as some sort of election tipping, mind controlling super tool. They probably loved it. In reality there's not even any scientific backing for their targeting and it most likely did barely anything, they just wanted to look like James Bond's Spectre.

Reading what's still out there about Cambridge Analytica, you'd never know the findings of the UK's Information Commissioner, which indeed found they were basically throwing data at scikit-learn and hoping for a result — the script-kiddie version of machine learning.

How many of the journalists have gone back to correct their misinformation on the topic? Many of the articles are still out there in their original form, and those hacks are still making hay on the subject.
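To make the "script kiddie ML" point concrete, here is a stdlib-only toy sketch. The dataset, features, and model are entirely invented for illustration (this is not CA's actual pipeline, nor does it use scikit-learn): when the "psychographic" signals carry no real information about the target variable, any amount of model fitting just memorizes noise and predicts at chance.

```python
import random
from collections import Counter, defaultdict

random.seed(0)

# Hypothetical "psychographic" features that are pure noise relative
# to the label being predicted (here, a binary voting intention).
def make_dataset(n):
    rows = []
    for _ in range(n):
        likes = tuple(random.randint(0, 1) for _ in range(5))  # page-like "signals"
        vote = random.randint(0, 1)  # generated independently of the likes
        rows.append((likes, vote))
    return rows

train, test = make_dataset(2000), make_dataset(500)

# "Throw data at a model and hope": memorize the majority label seen
# for each feature pattern in the training set.
counts = defaultdict(Counter)
for likes, vote in train:
    counts[likes][vote] += 1

def predict(likes):
    seen = counts.get(likes)
    return seen.most_common(1)[0][0] if seen else 0

accuracy = sum(predict(l) == v for l, v in test) / len(test)
print(round(accuracy, 2))  # hovers around 0.5, i.e. a coin flip
```

However sophisticated the pipeline looks from the outside, held-out accuracy stays near the base rate, which is the point the Information Commissioner's findings make about the gap between the marketing and the method.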


> How many of the journalists have gone back to correct their misinformation on the topic?

What was their misinformation? It seems like they were right about Cambridge Analytica’s goals and tactics. You are claiming their technology wasn’t great, but that doesn’t change how they obtained it or what capabilities they were selling.


Remember, there's also a powerful tendency to absolve people of the consequences of their actions by painting them as victims of powerful corporate mind-control.

If you eat fast food for every meal, is your obesity your fault, or are you a victim of food corporations' incredibly effective advertising?

If you smoke a pack a day, despite all the public information that smoking is harmful, is that your fault or the fault of the tobacco industry's advertising?

If Doctorow is right, and advertising is ineffective, there are a lot of people who will have to accept that their situation is a product of their own actions. For a lot of people, this is "victim-blaming" and unacceptable. It's a lot easier to cast large corporations as evil, powerful baddies who control us and therefore absolve us of responsibility for our actions.


There's no way to have any power in a situation, though, without taking the perspective that your actions shape your situation.


Yes, and that lack of power is a feature not a bug for some people.


Kind of off topic, but that show (The Young Pope Season 1) is very very well written, and is a good commentary on how to understand culture and realize how keeping things mysterious and hidden is a way to seed obsession. This scene[0] in particular is a good one.

[0] https://www.youtube.com/watch?v=hY8C3cIMR4o


They say to never attribute to malice what can easily be attributed to incompetence or stupidity. I think the extreme version of this is when emergence through stupidity appears to be a global conspiracy of super rich/geniuses. Humans do love stories and legends. They distract us from the truth which is that the world is exceedingly mundane.


You speak of Hanlon's Razor. There is a corollary called Grey's Law: "Any sufficiently advanced incompetence is indistinguishable from malice."


Contrast with Hanlon's Handgun though: "Never attribute to stupidity that which can be adequately explained by systemic incentives promoting malice."

https://news.ycombinator.com/item?id=21691282


Yes, I totally agree.

I guess it's human nature to prefer attacking on a moral angle as opposed to a pragmatic one (even though the latter is usually more effective).

We see this all the time even outside the tech sector, people will cry "omg you're such a racist" sooner than "you don't even understand numbers, you bozo".


I think it's clearly counterproductive to shout racist at everything; however, people who don't understand the numbers don't (generally) use numbers to define their stances. They choose what to believe in based on other beliefs, and they justify them with funny math.

The thing I like about your insult vs. "you're racist" is that you're attacking what someone is doing, rather than what they are. When confronting someone for being wrong, it's a lot more constructive to say, 'Hey, you're wrong about this, and this is why,' than it is to say, 'You're a moron (or a bad person), and this is why.' It's also more accurate. Labeling is helpful for navigating thousands of people in our communities and millions online, but keep those labels to yourself, and be ready to question them. Labeling during a confrontation, though, will cause the other side to shut down, and that's where we are. We're more concerned with defining each other as others than with finding commonalities and building on points of agreement.

I'd just add that significant power imbalance significantly changes things and how we ought to attack disagreement.


Yes, the point I think you're getting at is how many bad actors will use the aesthetics of rationality without the substance of it, and that's equally irrational. Like the screens of fake code from B-movie hacking scenes, they look like they mean something, but there's nothing behind the surface.

However, in such cases I think it's actually even more harmful to resort to those accusations as a first resort. Those arguments are usually wrong in at least two ways: wrong as in "incoherent, logically flawed", and wrong as in "defending reprehensible behavior".

Failing to point out the logical flaws can lead others to believe "he's unlikable but he's right", another type of posture that these people just adore acting out.

There absolutely exist genuinely racist, sexist, etc. people, and one should definitely call them out when appropriate. But those terms are very susceptible to "cry wolf" type of scenarios. If everything is the worst thing ever, then nothing is. Being more cautious with how one plays those cards allows them to be more effective when needed.

I completely agree with you about labeling, I couldn't have said it better myself.


Aside from how absolutely sad the humorlessness of society these days is, it's also a catastrophe that we've taken humor out of the critical tool belt. H.L. Mencken has a great quote about it, something along the lines of the most good being done in the history of man by people throwing cats into temples.


Re the criticism of C.A.: do you have any further reading? I recall a podcast where they interviewed one of the founders (his name was something comically evil, like a last name of Spectre) and he said it was just consulting mumbo-jumbo and it didn't work. But now I can't find it.


I think Cory misses a really big point here. He seems to assume that the source of disinformation is ads, such as political advertisement. This implies that advertising works if and only if disinformation works.

Of course, in reality disinformation goes through many channels on Facebook, but (my understanding is) a ton of it is organically shared and commented material and links, i.e. a different kind of content from ads altogether. So Facebook ads could still be ineffective, yet disinformation campaigns highly effective.

I'd also discount the studies on fake news from Spring 2017. The game has changed significantly since then.

My impression is a bit different from "criti-hype": I think Facebook is likely extremely good at promoting engagement, pulling attention, and changing minds -- HOWEVER -- FB has very little control over what content is able to be successful in this environment they've created. It takes an entire web ecosystem (quasi-news sites to host 'articles', etc etc) to conduct these disinformation campaigns and they have to resonate with people organically interacting, not just popping up in ads.

The fact that FB can't at-will divert all this attention onto breakfast cereal or brake fluid or whatever doesn't mean the disinformation is ineffective in the political sphere.

Edit: it's one thing to build an extremely fertile petri dish, it's another to control which organisms end up colonizing it.


Is there an appreciable difference between paid channels for disinformation and non-paid? Just yesterday on this forum it was discussed how 19 of the 20 largest Christian pages were Russian troll farms [0]. You don't have to be paying Facebook to craft a public narrative. Admittedly, it does take some capital investment creating the content farm: a small army of modestly paid office workers abroad.

[0]: https://news.ycombinator.com/item?id=28691585


>...were Russian troll farms

From that article

>These groups, based largely in Kosovo and Macedonia


Sure, the article doesn't say who is funding them, and they aren't "based in" Russia. But looking at the big picture, is the war-torn semi-recognized developing eastern European country of Kosovo really investing its limited resources into funding large-scale efforts to destabilize Western culture and discourse? To what purpose? Isn't it much more plausible that there's a superpower involved somewhere? Isn't it extremely likely to be the superpower we know already does this?

Sometimes big true-true different from small true-true.


> Sometimes big true-true different from small true-true.

More often, people who lie to you about the small things are also lying to you about the big things.

One might even nurse a prejudice against accepting the word of sources who lie consistently, casually and without need, suspecting that "true" actually means something different to them. Taking the more charitable interpretation.


>To what purpose?

Presumably to make money somehow. Seems far more likely than your vast conspiracy of attempts to destabilize Western Christianity through shitposting.


While I partially agree with there being a difference between just ads and what we're seeing happen, I didn't read it as being a 1-to-1 correspondence. Instead, it was pointing out the perhaps limited influence of...influence. In that maybe this isn't really "changing" people but highlighting them, giving them a platform, letting them congregate. Surely it attracts people that may not believe something yet and are susceptible, but the argument seems to be more of amplification and noticing this, rather than creating it as much. In other words, remove FB and you'll still have the same beliefs, just not as obvious to the outside. At least that was how I read some of the article. (I do think FB and the like are making things worse, but does it just look like that because we can see it better? or have the numbers grown rather than just each gotten more bold?)


The political split we mostly see is that of partisan reactionaries. It's not that the disinformation itself is powerful; it's that it is seen as manipulating millions of people, and some groups argue to remove that content. I think this is the fitting analogy for the article: the act of censorship gives misinformation its power, not people truly believing it.

Nobody took Alex Jones selling man pills seriously. Banning him turned him into a martyr and confirmed his theory that he is enemy of the state no. 1. A self-fulfilling prophecy.


People took Alex Jones seriously enough to harass the parents of children who were killed in school shootings. It's that kind of behavior that got him banned in the first place.


There are many instances of internet mobs coming down on people. I think explicitly teaching people not to believe everything they see on the net is still the best mitigation. I know it is naive to believe that it would solve the problem.

For people affected, the best advice is not to engage. Then it quickly becomes boring for the attackers, and they will soon be enraged by something else.

Sucks, but I believe this is an inevitable consequence of networking everyone to each other. The old tip to not expose too much information publicly helps here too. Public personas have to handle it like in meatspace: if you become too popular or infamous, you have to think about PR management. Sucks too, but I don't think there are many alternatives.

I still think this is mainly on those who actually send the threats rather than the instigators. Even if you think of his fans as mindless drones, they have a responsibility for themselves.


The political divide is because the institutions of power are winning.

It will be and always has been the goal of those with power to deflect attacks directed at them, usually to other groups who are attacking them. For all of his faults, Biden has been extremely good at convincing the lefties that "well, we'd be back to normal now if those damn Republicans would just get vaccinated", effectively turning the normal folks on the left against the normal folks on the right instead of where the anger should be directed: toward the institutions that let millions of folks go homeless, financially and medically insecure, etc. in the face of the pandemic.

The Republicans in power are, I think, a little better at playing this game: they whip their base into a furor over the group in power but only in such a way as they (those already with power) still come out on top. The people are angry at the right group, but for the wrong reasons (and really just to further the agenda of the _worst_ people).


The converse observation also holds - if people didn't take him seriously, he wouldn't have been important enough to ban.

The fact is that people do take him seriously, and it's a huge problem. You can ignore the problem, but the problem won't ignore you.


Indeed, most disinformation in Indonesia spreads via content, not Facebook ads.

But there's still a problem with Facebook: the echo-chamber algorithm. It traps people in their own bubble world.

I once tried liking just funny posts. Within a week, nearly all the content in my timeline was jokes and funny posts.

Now that's scary.
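A toy sketch of that feedback loop (the categories, weights, and update rule are purely hypothetical, nothing to do with FB's real ranker): a feed that samples posts in proportion to each category's accumulated likes converges on whatever you engage with within days.

```python
import random

random.seed(1)

CATEGORIES = ["funny", "news", "sports", "politics"]

def build_feed(weights, size=20):
    # Rank by engagement: sample the feed proportionally to each
    # category's accumulated like count (+1 keeps a little diversity).
    return random.choices(
        CATEGORIES, weights=[weights[c] + 1 for c in CATEGORIES], k=size
    )

weights = {c: 0 for c in CATEGORIES}
for _ in range(7):  # simulate a week
    feed = build_feed(weights)
    for post in feed:
        if post == "funny":  # the user only likes funny posts
            weights[post] += 1

final_feed = build_feed(weights)
share = final_feed.count("funny") / len(final_feed)
print(f"funny share after a week: {share:.0%}")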


One of the biggest and most common attacks against Facebook is that its algorithms are what drive misinformation so much on its platform; if those algorithms are so powerful, then why would Facebook's ads algorithm be so much worse?


Yeah, that's the question. My attempt at an answer: they can set up a powerful environment for disinformation to thrive and influence people, but it doesn't mean they can use it to push any random product with the same level of success.


If there isn't a "lying flat" movement for social media, there sure should be one. It characterizes my feelings perfectly: I give up on it. Nearing 1 year without Facebook, I can confirm it isn't essential and your strongest network connections will contact you if they wish to.

If Facebook could exist as it did one decade ago, maybe it'd be different.


It’s been over 7 years since I deleted my FB and I’ve never had Instagram. Their IP address scopes are blocked in my home. I do miss the now ancient FB, before they destroyed it with algos controlling what I got to see. That is when I got rid of it. Every so often someone will show me some hatred spewing diatribe someone in their family posted, and all the people piling on and I am happy being unaware. I get my quasi-news, conspiracy updates, and assurances that the world is about to end direct from Zerohedge.


I can barely remember the old, authentic social media at this point. That legacy has been completely drowned out by how dark and angry the world has been made by a few monopolistic tech companies squeezing out market share, engagement, and profit at all costs.


It was beautiful. Chronological, so you could see everything new with a quick scroll and not miss anything, and populated only with content posted by people you actually follow, which in those days were only people you knew from real life. No influencers, no celebrities, no talking heads. Then came the brands, and it all changed.


Authentic? I remember a lot of bragging and showboating and low quality (as in tabloid newspaper) content. It seemed to bring out the worst in people from pretty early on.


> a lot of bragging and showboating and low quality content

So basically a normal conversation between typical adults?


I guess expectations are (were) higher for written words :/


> I do miss the now ancient FB

Do you remember when the bottom had the quote "too close for missiles, I'm switching to guns"? My friends and I wondered what that meant for a long time.


Was it Facebook? I always remembered OkCupid as having that quote and a bunch of other 80s movie quotes.

Aside from being from Top Gun, what it refers to is the fire-control principle of being "danger close": you or other friendlies are within the collateral damage radius of a weapon. When one fighter jet is close enough to an enemy that blowing it out of the sky might also blow themselves out of the sky, the pilot switches to a less destructive weapon with a smaller collateral damage radius.

It's sort of interesting to think that something like this meant anything except early web 2.0 pioneers were largely xennials raised on Top Gun, Terminator, and John Hughes movies.


It's a quote from the movie "Top Gun" if that helps.


> I do miss the now ancient FB

I miss the "Random Play" option in the "Interested In" field. The modern incarnation is Facebook Dating, which requires the mobile app, so I'll never use it. I also suspect they use those attributes to target ads (e.g. shirts for tall men if you fill out the height field).


+1 for the zerohedge comment.

A few months ago something happened over at Zerohedge. The site became unviewable with JavaScript disabled, so I stopped going. But it looks like they've had a change of heart, and it can now be viewed with JavaScript disabled.

But it also seems like the content has changed from what it used to be.

Did they change ownership, or is it just my imagination after taking a break from the site when it became dependent upon JavaScript?


I can't tell if you guys are being serious, Zerohedge is Infowars with a cnbc angle.


Yeah also deleted my account a long time ago and blocked their IPs. Now I just need to get rid of 90% of my code, which uses pytorch, react and react native...


100% agree. Leave Facebook and social. IRC is much better. Frankly, context is the issue. My group of friends should not intersect with some other random group of friends. Content should remain within its context. Sharing to everyone is the worst thing possible. It just causes huge conflicts from different contextless people clashing over content the original poster usually considers harmless. Like, if anti-vaxers stayed in their own little groups, it might not be such a problem, but their content gets posted to everyone in everyone's networks, and the meme virus spreads, lacking the important "the people saying these things are fucking idiots" context.


This is pretty much what Google+ had, and what it failed due to.

If only relevant content is shared within a 'circle', people appreciate it, sometimes ignore it, and move on. Consistent controversy and drama is avoided, which just so happens to be a key driver of social media engagement.


> IRC is much better.

IRC is too technical for many people I know, but they are willing to use, or are already using Discord.


The only problem with Discord is that it is closed. Otherwise, it's a truly excellent platform.

In many ways, the problem with Social Networks isn't algorithms or whatever, it's the scale, and being on a small server with at most 50-100 other people is an incredible experience. I've met so many interesting people from around the world who wouldn't have the technical skills to navigate the IRC world, and I didn't have to blow up my brain in the process.


The problem with discord isn't that it's closed, it's that you don't own your own data.

The difference between open and closed is meaningless for the cloud. Give me the raw storage used by any piece of software and I can knock out a script that will serve it. Give me any open source network and without the data you have a ghost town no one will use.


I said closed, not closed source. Closed in the exact sense you are talking about: you don't own your data, you can't transfer in or out with ease, and you can't bring your own user agent without breaking the ToS.

https://en.wikipedia.org/wiki/Closed_platform


IRL is even better :)


I quit FB about 10 years ago, but I'm considering coming back because I keep hearing about the value of Groups. Sure, it's just mailing lists with a UI, but that's where the people seem to be.


The concept of disinformation as it's used today, and Doctorow touches on this in his essay, is itself infected with bias. We all can point to true and correct information that was at one point or another labeled as disinformation when it actually only was inconvenient to one or more powerful actors.

Personally speaking, the label has come to mean to me something other than "false information, intentionally spread". To me it has come to mean "something that could be false, but someone definitely doesn't want it talked about".


My understanding is that disinformation is information that is deliberately spread that is known to be false to further some agenda. Misinformation is where false information is spread without that intent.

From this definition, what’s interesting is that you can only identify disinformation by knowing 1) what is actually true and 2) the motivations and methods of the person spreading it.

I can think of two good examples of disinformation:

1. When oil companies denied climate change despite their internal research

2. When tobacco companies denied the importance of nicotine in their products


It's more than that. It's not just whether information is "true" or "false". What matters is the falsifiability of statements and assertions shared online in the first place:

> In the philosophy of science, a theory is falsifiable (or refutable) if it is contradicted by an observation that is logically possible, i.e., expressible in the language of the theory, and this language has a conventional empirical interpretation.[A] Thus there must exist a state of affairs, a potential falsifier, that obtains or not and can be used as a scientific evidence against the theory, in particular, it must be observable with existing technologies.

https://en.wikipedia.org/wiki/Falsifiability

A lot of disinformation boils down to making a claim that can't readily be falsified while defending their stance by using logical fallacies to deflect criticism, such as shifting the burden of proof to the other party.

In the past year and a half, a lot of claims made about the pandemic simply couldn't be falsified for lack of technology or methods to observe a potential falsifier. That doesn't mean those claims are outright false or true. It means they are plausible or probable, which are very different qualifiers regarding veracity. A typical example is the heated debate about the efficacy of masks and mask mandates.

Access to information has also demonstrated a downside. Research studies have been widely quoted and cited within the public debate to support all manner of arguments. However, pulled out of context, studies lose their value. Many tried to support their claims by cherry-picking, ignoring the scientific publication process (e.g. peer review) and so on. The net result is scientific research, regardless of its veracity, repurposed as token support for confirmation bias in the public debate via thinly veiled appeals to authority.

The relevant questions regarding disinformation aren't about the veracity of statements, but rather: Who are you anyway? Where are you coming from? And why are you making these claims in the first place?


3. When pharmaceutical companies denied the risk of novel gene therapies for inoculation against a family of cold viruses.


4. Ditto for the risks of OxyContin: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2622774/


Unfortunately, the most strident voices saying that any term is overused are usually those to whom even strict definitions apply.

"I'm not anti-sneetch, I just..."

Assuming that people use a term because of their own agenda is not helpful IMO. Maybe they actually do believe the information was false, and that it was spread deliberately. Why not give them the benefit of the doubt that you insist on yourself?


I would go further and say it is the exploitation of emotion to whip people into taking opposing stances (or a desired stance), thus hindering any dialectic or reconciliation process. We can all see how this campaign is a 24-hour show, so there is never any respite to cool down and take a breather. Disinformation is a tool for those nation-states or political parties that have the power to wield it.


This video will make you angry: https://www.youtube.com/watch?v=rE3j_RHkqJc


Absolutely. I find it horrific how, due to politics, we've moved all the way from "censorship is bad" to "we need to stop disinformation" which clearly implies soft or hard censorship. It also heavily leans to whatever political biases the company has and people think that's perfectly fine because they're getting one over their political opponents.

The end result is that lies become truth and truth becomes lies because dogma spread by tech companies (lobbied by other companies or political groups) is what is pushed and accepted while dissent is punished in varying degrees.

All in the name of "the greater good", unironically. It seems we learned nothing from fiction or history.


It has never been "censorship is bad"; it has always been "government censorship is bad"

Never has a private newspaper been required to publish every comment or opinion that they receive despite its inaccuracy or obscenity.


Warnings presented alongside content aren't censorship.


You say this as if there isn't widespread confirmed censorship AS WELL, for example, https://www.theguardian.com/technology/2021/may/27/facebook-...

That's FB censoring discussion of the origins of a pandemic responsible for millions of deaths, and even banning people who posted about the possibility.


I say that as if "we need to stop disinformation" doesn't necessarily mean "we need censorship". Which it doesn't.

I reckon conflating the two actually makes censorship politically easier by implicitly endorsing the view that censorship is the only effective way to combat disinformation at its source.


There is a lot of scope creep in how this has evolved over the past five years. Early in 2016 when "fake news" first started being widely worried about, the Washington Post did an expose interview of two guys in Long Beach, CA who had created a series of fake newspapers, not staffed by any real reporters or writers, where they wrote all the articles themselves under fake names, and just completely made up both the headlines and the content based on what was generating the most clicks. It largely ended up as political content because it happened to be an election year and that sold, but it was strictly made up. They made no attempt at all to even care about real events. This was very unambiguously fake news, and it wasn't uncommon for sites like this to pop up out of nowhere because it was cheap money for unscrupulous types to just feed the outrage machine and profit.

But two things happened. First, something that is explicitly fake is actually pretty easy to identify in an automated fashion, and thus easy to root out once you know you're looking for it. Second, Trump himself took over the term after it became apparent that many of the suckers being suckered into clicking on this crap were his strongest supporters, and completely poisoned the well by calling anything and everything he didn't like "fake news", as if an actual professional reporting operation at least attempting to do real investigations and interview real people were equivalent to two guys in an apartment making up headlines completely from their own imaginations.

Disinformation campaigns are somewhat orthogonal to that and a thornier issue. We're conflating many things. Some are just larger troll farms that are exactly equivalent to the two guys in an apartment, but operating with professional backing and financing in a more sophisticated manner. But they're just trying to make money. Some of what they say might actually be true. They don't care and aren't trying to push any particular agenda. But they're also not trying to produce true information and do so only incidentally if they do at all. At least some of it is actual foreign intelligence services conducting psyop campaigns, which definitely do care and are trying to push a particular agenda, but it's just to sow internal discord. They don't really care about one side or another of a contentious debate and will gladly inflame the worst actors of both sides. But these are being conflated with likely sincere people and organizations who are doing some level of real investigation and actually believe what they're writing, whether or not it's accurate. These are all umbrella'd under the "disinformation" term, but they're not the same thing.

Having Covid happen destroyed all hope of cleaning this up, too. We're now in the middle of an important global event affecting nearly everyone, but about which there is little in the way of facts, and we certainly can't know with certainty what an optimal course of action as a response was or should have been. We'll never be able to observe counterfactual worlds. Many important facts will only become known years in the future, and some will never become known at all.


E: The tone and direction of this comment would've redirected discussion away from an otherwise interesting piece, so I've changed it.

You can also read this on Cory's own website: https://pluralistic.net/2021/09/30/dont-believe-the-criti-hy...

or on archive.is: https://archive.is/iAsXZ

(My apologies to people who responded to the earlier version of this)


Thanks for the reminder; I forgot he has the same posts available at pluralistic. Unfortunately I don’t see a way for me to edit the link.

I agree Medium is not a great platform. Dark patterns aside, I was also disappointed to see their heavy censorship early in the pandemic, which kept many interesting analyses and speculation from being shared.


Does anyone know who runs archive.is? I imagine they’ve got some good data on news readership.


Search engine cache, another way to avoid medium.com website

https://webcache.googleusercontent.com/search?q=cache:https:...


I believe that Doctorow is missing the most important issue here.

The elephant in the room is the privatization of your social space and that Facebook is a private spy agency. That is what makes FB extremely dangerous.

The fact that FB is private, efficient, and full of technically brilliant people is not good for the rest of society, because society cannot imagine what those brilliant guys could do with their data.

For example, I go to a party with friends. I don't have Facebook, but one of my friends has it. He takes a picture of the group and posts it on FB.

Now the fact that I went to the party, the place and time of the party, the people who went, the private opinions of some friends about other friends, and girlfriends, boyfriends, or lovers (if they used FB channels like WhatsApp to communicate) have been recorded for the rest of my life and can be accessed by American spy agencies like the NSA.

Society expects FB to forget, because humans forget, but machines never forget.

People talk about a crush on someone with a friend and expect that nobody is listening (on WhatsApp), but a bug is always on, recording everything they write or say.

FB is like the Stasi, only way more efficient, and it covers the entire world population instead of just one country.

And they are constantly pushing the boundaries of privacy, like with "smart glasses", so the espionage reaches fully into your life.

And they are constantly improving their AI techniques, so even if they technically can't do something today, that doesn't mean they couldn't do it in five years with the data they have already collected about you.


Engagement or enragement, they still get you.

I get the feeling that visitors on Facebook are often angry and annoyed, and Facebook is fine with that.


I always get a small chuckle when I read an article about how horrible FB is. Not because it isn't horrible. It literally is the digital devil. But because 90% of those articles have a FB share button on them. As this one does.


I think the topline, that Facebook would rather be seen as competently evil than incompetent, is dead on.

But I think that, in his larger arc, Vinsel's epistemology isn't correct. Early optimistic predictions about emerging technology are just that - early and optimistic. There are always extreme early ideas about new tech. Just go google up people's predictions (and fears) from 2015 about how VR would transform things. The thing that makes them dangerous is when they show potential for being correct and society warps around them. We started spending billions of dollars on self-driving cars in 2015 with valuations based on them being delivered in 2020.

But what is the alternative? Vinsel wants us to have "sound information" - but making predictions that end up being wrong is part of checking that something is sound! The future is uncertain. It's true that nothing is ever as big as people think - sometimes it's bigger - but mostly it's much smaller. That guy who wrote the report for McKinsey (a group one should never trust in general) was either really hyped to get headlines or really hyped on AI, and because he's good at his job you can't tell which it is. The article is clearly nonsense, but it has credibility because it exists on the far side of a sea of uncertainty about something. It's fantasy to imagine we can "know" what something will be, and this process seems unfortunately human and unavoidable.

I also think there are a lot of people doing good work in STS on these issues. John Cheney-Lippold, Natasha Dow Schüll, Virginia Eubanks, and Adrian Mackenzie are some of my favorites. I'm excited to read Your Computer Is on Fire but I haven't gotten to it yet. Heck, I think 90% of the bad tech hype is covered in the old classic Leviathan and the Air-Pump.


I keep getting shown very long and well produced videos of very complicated steak recipes where they end up horribly overcooking it or drowning it in conflicting flavors. Someone made a stew out of what looked like a $60 ribeye.

My only conclusion is that their purpose is to draw hateful comments for “engagement”.


Disinformation is a problem for all social media platforms and all languages; FB is just one of those platforms. I have the feeling that the media focuses more on FB & English, but we should rather focus on the problem as a whole and its harm to all communities around the globe.


Thanks for sharing this, found it a very interesting read and analysis of FB and disinformation/advertising. One clue should definitely be that "if we just fix FB the problem will be solved or substantially better" can't be as true as people want. It would help, but it is not the real source of the problem. An amplifier that could (not so easily?) be replaced.


Facebook thrives when talking about Facebook so stop talking about it


TIL "criti-hype", a useful label.


The problem is nobody seems to hit Facebook where it hurts, their advertisers.

Why? Advertiser pressure combined with the losses Facebook is taking due to Apple's policies would certainly foster some change?


You think the advertisers want change? We are talking about companies that have known about climate change for the last 40 years, that shifted production to China despite the lack of human rights and environmental laws. Correct: not in spite of, but because of.

These companies pull some ads to pretend to be the good guys who care, but as soon as the storm is over they are back again.


Even if the tech doesn't create the conspiracy theorists but just allows them to "hatch" in a welcoming environment, that role is still far from harmless. Had those people not found like-minded company through Facebook, they wouldn't have manifested their inner conspiracy theorist, and in turn could have voted differently if the doubts within hadn't cracked the shell of common sense built around them.


I agree but I wonder what the solution is. "finding like-minded company" is basically a high level description of what social media is at its core, hence why you find the same conspiracy mindset in so many locations and forms (sometimes ones you wouldn't expect, like /r/superstonk or twitter circles devoted to public transit / public policy on transit)


Channeling an old po-mo saw, I've commented before to the effect that the social platforms picked "disinformation" as an enemy to persuade you that the rest was real.

To be a conspiracy theorist just means to not be aligned to the dominant narrative, and since most intelligent people gravitate toward coherence, when the dominant narrative becomes absurd, you get theories and explanations from the people repelled by it. They usually suck, but when deception prevails, conspiracy theories become the more coherent option and they have better predictive power than the narrative itself.

Today, it is objectively more plausible that a cadre of leaders is co-ordinating to exploit a crisis to drive a radical agenda using arbitrary compliance as a signal and lever than it is that paper masks we can smell farts through are necessary to prevent the spread of disease. Gravitating to the former is the direct result of being repelled by the incoherence of the latter.

To sustain the dissonance of the absurdities of power, it requires education and practice, because it does not come naturally to anyone. If you want to reduce conspiracy theories, present a coherent narrative that doesn't require sustaining the dissonance of absurdities.

I like Doctorow's writing, but if he isn't banned, I'm pretty sure FB and other platforms have nothing to fear from him.


> To be a conspiracy theorist just means to not be aligned to the dominant narrative

I don't believe that. One can be skeptical of, or even opposed to, the dominant narrative without becoming a conspiracy theorist. The key is in the word itself: conspiracy. It means people operating together toward some hidden and generally nefarious goal. That's not only attribution of motive (a fallacy) but also positing something that's often quite implausible. How many people would have to retain perfect secrecy for how long to support the "fake moon landing" conspiracy, and how likely is that really?

Even if we decide that "conspiracy" is being misused and focus on "theory" instead, that theory is also based on assumptions and coincidences that are as "absurd" (your word) as the dominant narrative. For example, you're assuming that the leaders "exploiting a crisis" to drive a "radical agenda" outnumber those interested in seeking the truth and/or serving the public. You're assuming that being able to "smell farts through" a mask means they're useless, even though the offending molecules are smaller than even the smallest virus particles and real-world results show that widespread mask wearing does reduce transmission. Your theories are no less "dissonant" than anyone else's, but even if they had some sort of internal consistency that doesn't make them correct.

Conspiracy theorists are people so desperate to be not only right but more right than others that they throw true skepticism and logic out the window.


I'd accept the definition of conspiracy requires co-ordinated intent and deception, and that divergence from dominant narratives does not require a belief in a conspiracy, but I would reassert that accusations of conspiracy theories are used exclusively to isolate and delegitimize dissent from dominant narratives.

Anyone who makes predictions based on perceived incentives and not official narratives is a conspiracy theorist. You can't just use the epithet as a synonym for inferior and outgroup and expect people to treat it as a serious argument. This qualifier that the goal of a conspiracy has to be nefarious is ridiculous, because that just means it's fine when our side does it, because no nefariousness means no conspiracy.

People who think that way aren't engaged in discourse, they're at war, which is fine, but pretending you're not at war when you in fact are is called perfidy. Calling something a conspiracy theory isn't discourse.

I'm pretty sure we landed on the moon, but I'm also pretty sure that hundreds of thousands of people worked in secrecy over decades and conspired to intercept the private communications of pretty much everyone in the world, while subverting standards bodies, science, academia, and open source projects, as we found out in 2011-2013 thanks to Binney and Snowden. Conspiracies are not exceptional, they're common as muck, ask any regulated professional.

We could probably share an Ig Nobel for comparing virus cells to fecal particulates (farticles?), but even charitably, comparing any transmission reduction effect of masks to the cost of enforcement and the effects of cynical policy response still yields a co-ordinated effort to deceive the public with noble lies "for their own good." There has absolutely been a co-ordinated effort to leverage the crisis to institute radical policies; denying that is so unserious as to be gaslighting.

What we're likely haggling over is the point where facts dissolve into focus and narrative, and where shared ideology becomes conspiracy. My view is that ideology is inextricable from conspiracy, where a more materialist view would be that a conspiracy is just when you've been caught and convicted of it.


> accusations of conspiracy theories are used exclusively

Exclusively? Really? Like there are no actual crazy or malign conspiracy theories out there? Literally none?

> This qualifier that the goal of a conspiracy has to be nefarious is ridiculous

I very specifically said "generally nefarious" to indicate that nefariousness was non-essential. The essential parts (as you'll find in a dictionary) are "operating together" and "hidden" and you already conceded on those. Your original key claim ("to be a conspiracy theorist...") was clearly ridiculous, and you even seem to have admitted as much. What's the point of quibbling about non-essential and secondary parts?

> conspired to intercept the private communications

Nice red herring you've got there. Be a shame if anyone thought you were avoiding actual productive discourse by resorting to it.

> You can't just use the epithet as a synonym for inferior and outgroup and expect people to treat it as a serious argument.

Never did. Is that a strawman, or attributing motive, or both?

You've dug yourself a nice deep hole of bad faith here. Either start climbing out of it, or at least stop digging.


I think I'm being trolled! What a hoot.:) However, litigation isn't persuasive. I'm glad to have struck a nerve, because I get to read final boss level arguments that I still think reduce to backbiting, envy, and nihilism, but this fundamental and probably irreconcilable level of disagreement on principle is too meta and off topic for us to have here.

One hopes you will one day use your powers for good. See you in battle!


> You've dug yourself a nice deep hole of bad faith here. Either start climbing out of it, or at least stop digging.

This tone is working against you more than you realize.

Don't respond until you can process what they're saying without capping it off with such provocative, emotionally charged bait, please.


A minor nitpick in an otherwise inspiring article: I feel like the argument about ad efficacy also wants to have it both ways - if advertising sucks, why do we believe that it works? Or in other words, if advertisers are good at advertising themselves, why would they be bad at advertising anything else?


There are definitely some advertisers who are not getting their money's worth out of FB ads, and likely some political advertisers are among them. But there's a lot of advertisers who are, and can track their sales to FB spend.


> If the best argument for vaccine safety and efficacy is “We used the same process and experts as pronounced judgement on Oxy” then it’s not unreasonable to be skeptical — especially if you’re still coping with the trauma of lost loved ones.

That depends on what you mean by "not unreasonable." If a device tells you a lie one time out of 50, it's a misleading device, but it's still preferable to one that lies 1 time out of 3. The process that leads from "authority sources of information are sometimes egregiously wrong" to "all sources of information are equally misleading" is the very foundation of misinformation.

Do I believe that the FDA is "the truth"? Of course not. But I still believe it more than any other source on drug safety.


I love the accusation of disinformation. When you call somebody a liar you just sound like a psycho. But when you accuse somebody of spreading disinformation you boldly declare yourself a soldier in the army of wokeness. Which is so much better.


Spreading disinformation can be done without any ill intent. The source is trying to willfully manipulate people, but the person who spreads it is just a forwarding agent.


That logic is historically popular with the goose stepping enthusiasts.

We aren't censoring/oppressing/deplatforming/infringing-upon-your-rights because we think you are evil. We are doing it because you have been fooled. Because we know better. We're doing it for your own health and the health of our society.

They love to say that shit.


Are we talking about established facts like "the earth is not flat" or gray areas like "NFTs are good"?


The point of the criticism of "disinformation" is to allow Facebook to increase its censorship. It's like the caricatural movie scene of someone screaming "somebody stop me". Facebook wants to be criticized for "disinformation" so that it gets the excuse to employ censorship and "fact checkers". Those fact checkers will always conveniently focus on exactly the political issues with the most controversy. The value of political censorship is much higher than that of normal ads.

They are just selling political censorship in some corrupt way (you'll never see an official transaction, but there are deals behind the scenes).


time to buy more FB shares


Facebook in 2021 is like AOL 10 years ago. No one I know uses it. Why is there so much noise about it?


It's the current scapegoat for 'disinformation', giving the government and all other propaganda peddlers a breather.


I wouldn't say they are a scapegoat. But they are the largest platform with the widest audience and are therefore the largest platform for disseminating disinfo.


Because their products are used by over 2 billion people a day and according to their own research it's having a negative impact on a significant chunk of them.


> according to their own research

I mean... you just read Cory's whole story about how maybe Facebook and co. are manipulating the data to make it seem like their product is better than it is, and now you cite Facebook's research saying that Facebook reaches 2 billion people?


They could be off by an order of magnitude and it would still be pretty bad.

Here are their official numbers from Q2: https://s21.q4cdn.com/399680738/files/doc_financials/2021/q2...


Have you considered the possibility that the numbers they are reporting are accurate but meaningless? What exactly is "reach", anyway?


They know more about you than your mother, I'm sure they can do a decent job counting the users that they track.


By buying data like all the other companies do? Not impressive. They don't track me directly.


Facebook pixels are on like 95% of major websites these days, so unless you block them they have a pretty good profile of your interests.

Even if you block them, they definitely have your contact info and a shadow profile for you thanks to your friends, who definitely gave Facebook access to their contacts at some point. The contact graph has enough info to help them figure out where you might live, what school you went to, and where you worked.


If you block them (the Mozilla FB container is great for this), how do they connect your phone number, which the Facebook app scraped from your friend's phone, to your internet browsing on a third-party site? Where I might live, what school I went to, and where I work seem like public information already, maybe in voter registration databases.


1. They probably still track your 3rd party app usage, which is not as easy to block as on the web.

2. Since they have detailed information about your friends the contact graph has enough information to predict your age, probably what your major was, political leanings, etc. Not to mention photos of you that might have been posted by your friends on facebook.

3. They used to buy 3rd-party data that they could join on your phone number and email address.

Sure, some HN users might have enough know-how to block most of their tracking, but 99% of Facebook users don't.


I want to see minutes per person


I want to see the number of people who spend 8+ hours per day on the site.


No one you know uses Instagram and/or Whatsapp?


90% of the people I know use Facebook. And more people use Instagram. Most people over 30 use Facebook.




It's crazy to me that in all these 80+ comments about FB disinformation there's no mention of the most dramatic recent example of Facebook mass manipulating culture - their admitted systematic censorship of the lab leak theory [0].

They even banned some of the posters. This was done based on the word of a paper in The Lancet that later turned out to have 26 out of 27 authors connected to the lab [1].

Is that insane culture attack not worth a mention, if the topic is disinfo on Facebook? The damage from that particular campaign is hard to measure, but what's not in doubt is that they gaslit the world on the flimsiest of "scientific" bases, and are getting away with it scot free.

0 - https://www.theguardian.com/technology/2021/may/27/facebook-...

1 - https://www.telegraph.co.uk/news/2021/09/10/revealed-scienti...



