News Feed FYI: Addressing Hoaxes and Fake News (fb.com)
84 points by envy2 on Dec 15, 2016 | 141 comments



Remember:

- It all started with Facebook being accused of manipulating the timeline to remove conservative stories during the campaign

- At the time, their PR was along the lines of: we will fire the staff and replace it with an "objective" algorithm

- Shortly before the end of the campaign, the Fake News narrative started to be tested under various forms ("conspiracy theories are going mainstream", "damaging rumors" etc...)

- Post-election, the Fake News narrative started to be really pushed by politicians

- At the same time, FB started to announce actions to fight the so-called "Fake News Problem".

- And these days, listening to FB and the mainstream media, the rhetoric is as shameless as "We need to clean the news".

Having been caught doing some very troubling things, FB's answer is to double down with what would be called censorship if it were being done in China.

Shutting down opinions based on the views expressed recalls certain patterns from not-so-old history. I find it shocking that people accept this development so quietly. Everybody seems to be OK with it. It seems like citizens are no longer awake and fighting for the bill of rights. Seriously, folks, how can something like this possibly happen in this country? Do you realize what is happening here?

In 2016, the average citizen thinks he is a good citizen and is proud of being "politically correct" about a number of small things (often ridiculous, and preventing him from maturing his own opinions), but everybody is comfortable with movements such as "Cleaning the news" being organized in plain sight. That's how you know something is fucked up. This is a shame.


"It seems like citizens are no longer awake and fighting for the bill of rights"

Nothing in the bill of rights applies here. You don't have freedom of speech on Facebook - They make the rules and decide what is allowed and not allowed. It's a company, not the US Government. If enough of their users are upset that fake news like "The Pope endorses Donald Trump" (a real example) is surfacing constantly on their news feeds, it is in Facebook's best interest as a company to try to remedy the situation. They make money when people are actively using their site - and if the "Fake News Epidemic" is causing people to stop using Facebook, they'll try to stop it.

Nobody is making anybody use Facebook - it does seem kind of silly to me, but clearly they see value in this (or not doing this was losing them money). What we really need to do is stop treating large corporations like they are a government entity and getting upset when they start restricting "free speech" - they have no obligation to allow people to post anything they want, and can manipulate your timeline as much as they want because you signed their ToS.


I just posted on another thread that the problem here is that Facebook's positioning/PR says that they are a neutral agent.

"Share what you want with your friends / family etc." This is why this case is different. If I use ExxonMobilBook, I kind of know what to expect of Greenpeace related stories.

Given their size, extra precaution needs to be taken. They either need to roll back, or make their agenda more apparent.


"They either need to rollback, or make their agenda more appearent."

That is an opinion. They, in reality, don't need to do anything, nor do they owe you anything. Just because their positioning/PR makes them seem neutral, that doesn't mean they are required to be. They don't need to take precaution, and I will eat my shoe if this decision is what causes Facebook to lose users. Fact of the matter is, 99.9% of users don't care, .09% are cool with facebook doing this, and .01% (you) want them to roll back.


To be fair, facebook can, if it wants, put full-blown nazi propaganda on their network and delete anything anti-nazi. Of course, just because it's legal for a company to do that doesn't mean it's ethical and okay and we should just accept it. Similarly, while it's perfectly legal for facebook to select news stories to try and politically influence its users, that would be highly unethical.

To be fair to facebook though, reddit has been caught actively doing this repeatedly, so I don't know why everyone is harping on facebook instead of reddit.


> 99.9% of users don't care

More like 99.9% of users don't know enough about the situation to warrant an opinion.

I mean, if you asked random passerby on the street about Facebook's neutrality in terms of freedom of speech, I guarantee you that 99.9% of them will respond with something along the lines of "I don't know, I just use it to share funny cat videos with my friends lmao xD" or "If you don't like it you don't have to click on it".


First of all, you have the chronology right but the causation wrong. After conservative pressure to stop editorially reviewing news, fake news has increasingly trended. Since the election, people have realized how untenable this is and so Facebook is looking for a solution.

This has nothing to do with the bill of rights. Facebook is not the government, government is not requiring fake news to be censored, and the fact checkers are independent organizations.

You don't have an inalienable right to spread whatever political lie you want to millions of people. Facebook is this era's nightly news and, just like the nightly news, they have a right to avoid sharing crazy conspiracy theories and outright lies.


>Facebook is this era's nightly news and, just like the nightly news, they have a right to avoid sharing crazy conspiracy theories and outright lies.

I have no problem with that. I have a problem with them choosing what I can share with my friends, and more generally what people can share with each other. "Sorry it's fake news so we limited the reach".

The problem is that they are not open. My only problem is opacity. HN is transparent, Breitbart is transparent. You know their opinions, you know their agenda. But this is not the case with FB which pretends to be neutral, and this is the issue.

>This has nothing to do with the bill of rights. Facebook is not the government, government is not requiring fake news to be censored, and the fact checkers are independent organizations.

Do you remember the anti-trust suits against Microsoft? You can't pretend that a company with the influence FB has is just a corporation like any other. At that size, you have added responsibilities/expectations, since your actions directly impact a significant part of the country.

Moreover, they coordinate with politicians, so this has everything to do with the Bill of Rights at this point. Citizens should at least pay attention to what is happening.


> I have no problem with that. I have a problem with them choosing what I can share with my friends, and more generally what people can share with each other.

They're not policing what you share with your friends, just what you can do on their platform. They are under no obligation to provide infinite reach for whatever drivel you want to share.

If you're sitting in a bar sharing racist anecdotes with your friends, that bar has the right to kick you out if they feel like. It's their bar.

If you don't like Facebook's policies, go share somewhere else. There's even an alt-right social network you'll fit right in on.

> The problem is that they are not open. My only problem is opacity.

They're completely open about it. They've posted multiple stories about how they're addressing this, will provide links to third-party analysis for any flagged stories, and list the criteria for third-party fact-checking. [0]

> Moreover, they coordinate with politicians, so this has everything to do with the Bill of Rights at this point.

No, it does not. The Bill of Rights restrains what the government can do, it has nothing to do with what individual corporations can do.

Do you also think Smith & Wesson should be required to provide free guns, since the right to bear arms is in the constitution?

[0] http://www.poynter.org/fact-checkers-code-of-principles/


> If you're sitting in a bar sharing racist anecdotes with your friends, that bar has the right to kick you out if they feel like.

Except for a lot of people there is no other bar. Your friends are in that bar, the bar has used their position to close other bars. And the only other bar still open is 20 miles away in a basement.

> If you don't like Facebook's policies, go share somewhere else. There's even an alt-right social network you'll fit right in on.

The media is pushing the line "Alt-right == Nazi", so you are calling someone a Nazi because they have issues with Facebook controlling what their users see. After Facebook has already admitted to large-scale manipulation of their users' emotions[0].

[0] https://www.theguardian.com/technology/2014/jun/29/facebook-...


There are tons of other bars. What exactly has Facebook done to close other bars?

Twitter is still around and very popular. So is Reddit. Heck, there's even a social network built just for people like you: https://gab.ai/


>It has nothing to do with what individual corporations can do.

My point was that we should pay attention as "citizens" because Freedom of Speech is, or may be, concerned. I wasn't trying to build a legal case with references accurate enough to be received by a judge. That this is about "Freedom of Speech" is the only thing I was trying to say.

>If you don't like Facebook's policies, go share somewhere else

You're ignoring my point that FB is not a corporation like any other. According to you, Microsoft could have gotten away with "People can use another OS" in the 90s, but that is not how things worked out. We need to respect the law and understand why these kinds of laws were put in place. We paid a huge price before understanding why we needed them; repeating history will be expensive.

I can understand your point, I hope you could understand mine too even if you disagree.

Finally

>Will provide links to third-party analysis for any flagged stories, and list the criteria for third-party fact-checking. [0]

Thanks, this is interesting. Let's see how it turns out.


> If you don't like Facebook's policies, go share somewhere else.

Okay, I'll just go find another social network where millions of non-technical individuals congregate en masse.

Oh wait...


Facebook censored me when I tried to share the Snowden Documents.

I barely trusted them before and haven't trusted them since.


> I have no problem with that. I have a problem with them choosing what I can share with my friends, and more generally what people can share with each other. "

They don't choose that, they choose what you can share via their platform. You can share whatever you want with friends, etc., using your own resources instead of theirs, and Facebook has no say.

Facebook limiting the use of Facebook's resources to things Facebook is comfortable with may not make them your ideal service, but why should they be obligated to be that?


> Facebook limiting the use of Facebook's resources to things Facebook is comfortable with may not make them your ideal service, but why should they be obligated to be that?

Ideally, they shouldn't. But, as of this writing, they have a virtual monopoly on social media.

- Want to talk to your friends? Message them on Facebook Messenger/WhatsApp/Instagram.

- Want to share photos? Send them via Facebook Messenger/WhatsApp/Instagram.

- Want to see what kinds of events are happening in your city/town? Facebook Events.

- Want to see different stories of the day (the whole "staying informed" thing)? Facebook Pages

- Want to use a different service other than Facebook? Good luck, because everyone else, who may not be as skilled in tech as you, is still hooked on the service.

If another IM or events service gets popularized, then FB's foothold might be loosened. But, until that happens, we have to actively make sure that the things shared on the platform are as free as possible.


  This has nothing to do with the bill of rights. 
Agreed.

  You don't have an inalienable right to spread
  whatever political lie you want
In the U.S., you do. "Real" media do it all the time. There are laws against making slanderous or libelous statements against individuals and entities, but every American has the right to say, "the moon landings were faked and greens are smarter than libertarians" if they want.


You dropped the most important part of that quote: "to millions of people."

You're free to go on any street corner and rant about the moon landings. Facebook just isn't under any obligation to give you a platform to reach millions of people with that.


Even Facebook readers can post such things, and Facebook staff can remove it if they choose. Otherwise, you are seeking prior restraint.

Typically, an individual lacks the reach to make statements visible to "millions" of Facebook feeds, unless they are celebrities.


Sure, but my point is that Facebook is under no obligation to allow you to post that stuff and is free to remove it whenever they feel like it. They could delete Breitbart's page tomorrow if they felt like it.


> You don't have an inalienable right to spread whatever political lie you want to millions of people.

Well, that's arguable, but in any case, and perhaps more to the point here, in the US you don't have a Constitutional right to force any private party to cooperate with you in spreading any political idea (lie or not) you want, even when you have the right to spread that idea yourself.


Ooh, juicy controversial comment. I hope some people take the time to read it instead of passing over it because it's slightly greyed out.

I for one don't agree that reddit and facebook have a "duty" to remain uncensored. If Breitbart is allowed to be overtly pro-Conservative while claiming to be the only "real" news source, reddit and facebook should be allowed to do whatever they please. All the other news agencies do it, what makes facebook so special?


Isn't it a problem when a news story saying that the Pope supports Donald Trump gets over 1 million shares? This isn't opinion, this is pure fabrication.


A guy fired a gun in a pizza place; this is facebook's response. That guy arguing this is an affront to the bill of rights seems to have never read the bill of rights.


The article could go on and make the case that the actions of the Pope are not aligned with his rhetoric, which shows that nothing is as black and white as you suggest.


The example you responded to isn't a "What If" scenario - it actually happened. The headline and body of the article claimed Donald Trump was endorsed by the Pope with no evidence. It's fake news no matter how you look at it.


>The example you responded to isn't a "What If" scenario - it actually happened.

Then you are actually making my case. My point was that it would be hard to call that fake news; it is an opinion.


You are gaslighting.

It is not "opinion" on whether the Pope endorsed Trump. It is unequivocal fact that he did not. Reporting to the contrary is fake news.


As long as they stick to flagging/marking posts while keeping them visible I don't see how that's different from how HN works. Are you opposed to that as well?


>Are you opposed to that as well?

HN never pretends to be neutral; it is a community with a set of rules. For example, we even expect them to filter off-topic posts, and we know very well the opinions of the moderators/founders and where their line in the sand is.

However, FB pretends to be neutral. If it's only flagged and there is no impact on the ranking algorithm, of course there would be no issue. It would just be a way to know FB's opinion. But, frankly, we know better than that; of course the ranking and the reach will be impacted. This is even the stated intent.


ok, so now the obvious question becomes, who fact checks the fact checkers?

i don't blame facebook, but the people that have been asking for this i guess are about to get what they deserve: more consolidation of the power of persuasion of the masses into the hands of facebook. want to instantly discredit a story? just UPDATE stories SET disputed = t WHERE ID = ?


Why can't facebook just go after blogspam clickbait in general? If a story is a copy of a copy, have the algorithm favor the original instead of the blogspam. If a local news agency copies an AP/Reuters story word for word, have the software suggest replacing the replica with the original. Treat local news duplicates the same as any wordpress+adsense blog. If journalists are adding commentary that's a different story (albeit most commentary is a needless act of faux value addition anyway), but mashable et al. are just embedding imgur/youtube content with a catchy headline and the letters h/t appended to the end. This is a continuation of what ebaumsworld has done since forever ago. Deaggregate the aggregating middlemen.

Ripping the embed out of the blogspam, and turning it into a direct link, is trivial. If the only content on a page is a youtube video covered in a hundred ads, just return me the youtube video. Flag websites as blogspam, and more aggressively return just the content they "stole." Make it hard to get off that shitlist. I'd rather facebook make a little money deconstructing the web than whoever spun up wordpress and paid for a bad theme being rewarded for polluting it. Facebook has an opportunity to add through reduction; to add value by finding signal in noise, leaving noise on the cutting room floor. Deduplication should be something incredibly easy to do algorithmically, compared to fact checking. (Hell, facebook is already halfway there with trending, although it too often lumps together semi-unrelated events.) Google News and Techmeme do a great job of it. Techmeme eloquently spells out the solution http://news.techmeme.com/111031/techmeme-revealed
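
For illustration, a rough sketch of what "ripping the embed out" could look like (Python; requests/BeautifulSoup and the function name are my own assumptions, not anything facebook actually runs):

  # Hypothetical sketch: find the first YouTube embed on a blogspam page and
  # return a direct watch link instead of the wrapper page.
  import re
  import requests
  from bs4 import BeautifulSoup

  YOUTUBE_ID = re.compile(r"(?:youtube\.com/(?:embed/|watch\?v=)|youtu\.be/)([\w-]{11})")

  def extract_youtube_link(page_url):
      """Return a direct youtube.com link if the page is just a wrapped embed."""
      html = requests.get(page_url, timeout=10).text
      soup = BeautifulSoup(html, "html.parser")
      for tag in soup.find_all(["iframe", "embed", "a"]):
          src = tag.get("src") or tag.get("href") or ""
          match = YOUTUBE_ID.search(src)
          if match:
              return "https://www.youtube.com/watch?v=" + match.group(1)
      return None  # no embed found; leave the original link alone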

Too many different ideas are getting conflated here. Clean up the spam/dupe mess FIRST, and sifting the sensationalized salon/dailycaller hyperbole from the blatantly false news becomes easier (because then we aren't drowning in nearly identical but different copies of articles.)


Maybe follow the link and evaluate the ad:content ratio or just the raw number of different ad services on the page. Seems like the collateral damage to such an approach would largely improve the overall web experience.
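
A crude version of that heuristic might look like this (Python sketch; the ad-domain list and function name are illustrative only):

  # Count how many distinct known ad-serving domains a page pulls resources from.
  from urllib.parse import urlparse
  from bs4 import BeautifulSoup

  AD_DOMAINS = {"doubleclick.net", "googlesyndication.com", "taboola.com", "outbrain.com"}  # sample list

  def ad_service_count(html):
      soup = BeautifulSoup(html, "html.parser")
      seen = set()
      for tag in soup.find_all(["script", "iframe", "img"]):
          host = urlparse(tag.get("src") or "").netloc
          seen.update(d for d in AD_DOMAINS if host == d or host.endswith("." + d))
      return len(seen)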


If the ads load quickly, from few domains, and work well in mobile, you can assume it is not a legit news site.


I do think it's important to consolidate duplicates of stories, and credit the original copy. Maybe a feature (that users can disable if it fails) that automatically finds the source and replaces the post with the original.

Maybe two versions of the newsfeed, the default "sterilized" and a hidden but accessible "raw"


> Why can't facebook just go after blogspam clickbait in general?

Hard to do with just a few algorithms. They would need human eyes reviewing sources. And since Facebook seems keen on eliminating its human workforce through AI/ML, I doubt that they would suddenly embrace human reviewers again.


I entirely disagree. You extract the content, and diff it against other extracted content. You don't need a human eye to determine that NBC11 republished an AP/Reuters story word for word. This should be somewhat basic pattern matching. We are talking about a company that can brute-force vanity onion addresses!
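
For the word-for-word case, something as simple as shingling plus Jaccard similarity would do it (toy Python sketch; the names and threshold are mine, not a claim about how facebook would do it):

  # Toy near-duplicate detector: compare word shingles of two articles.
  def shingles(text, n=5):
      words = text.lower().split()
      return {tuple(words[i:i + n]) for i in range(max(len(words) - n + 1, 0))}

  def jaccard(a, b):
      return len(a & b) / len(a | b) if (a or b) else 0.0

  def is_republished_copy(candidate, original, threshold=0.8):
      """True if the candidate article is a (near) word-for-word copy of the original."""
      return jaccard(shingles(candidate), shingles(original)) >= threshold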

Semantic analysis should be able to detect procedurally generated content farms. Human turking might be harder to detect, but once a site gets flagged, all its posts can be checked with stricter scrutiny.

It is extremely easy (from a computational standpoint) to rip a youtube embed out of a page and directly link to the source. If mashable and yournaturaldietnews are known content embedders, more aggressively deconstruct their pages.

There also has to be a way to crowdsource content verification. They have 1.8 billion people, and a subset of those people give good feedback. Maybe some people's reports should be weighed more heavily than others, if they have a history of making valuable reports.

Maybe you're thinking fake news is mostly political. A lot of it is DIY and Health Tips, tech lifehacks, etc. Rehosted videos with a banner added on the top and bottom. Animals. Any content taken from elsewhere can be easily detected, similar to TinEye or reverse image searching.


I hate blogspam, but Facebook has to be seen to be taking direct action against fake news right now. They can't afford to take detours to fix other problems first.


they are one and the same problem http://www.phdcomics.com/comics.php?n=1174

quite honestly, "mostly fake but slightly rooted in truth" blogspam is 10000x the problem that actual straight fake stories are. vitamins, detox, antioxidants, etc., are a much larger pool of fake news than fabricated propaganda. people are making up outrageous stories for ad revenue, not to win elections. state-sponsored narratives are unlikely to get flagged by these types of measures because they will be published by credible sources. I doubt NYT will start failing snopes tests. I doubt every SPLC post will get flagged for being out of context or unverified.


I think you misunderstand the format of a fact check. It's a distinct piece of content that has its own schema.org markup. It should include only editorial analysis and not opinion. It features reference sources and encourages readers to visit them. It is in its ideal form pure objective journalism. Sites adorn themselves with cute logos about the veracity of the claim but really they should be serving a compilation of facts. If they fail to do so regularly Facebook's credibility is on the line so they will be ensuring they only link to quality sources.

They couldn't just batch discredit a topic because they need something to link to.
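
Presumably the markup in question is schema.org's ClaimReview type; roughly something like this (a minimal JSON-LD sketch shown as a Python dict, with made-up values):

  import json

  claim_review = {
      "@context": "https://schema.org",
      "@type": "ClaimReview",
      "url": "https://example-factchecker.org/pope-endorsement",  # hypothetical fact-check URL
      "claimReviewed": "The Pope endorsed Donald Trump for president.",
      "author": {"@type": "Organization", "name": "Example Fact Checker"},
      "reviewRating": {
          "@type": "Rating",
          "ratingValue": 1,
          "bestRating": 5,
          "alternateName": "False",
      },
  }

  print(json.dumps(claim_review, indent=2))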


> who fact checks the fact checkers?

Who polices the police? Who governs the government? Who grants authority to the authorities?

In western democracies, ideally the answer is all the same: the people. Ideally, once a power broker starts to lose credibility, they start to lose their power.

And that's really what this is about. Facebook has lost credibility given the ease with which these new social technologies have been co-opted into propaganda tools. Sure, on one level they might want to discredit stories, but I think the far more likely scenario is that Facebook as a platform for eyeballs has been discredited, and fewer people taking it seriously hurts the stock price.

Even as a developer (and potential employee lead), I've more viciously ignored the quarterly facebook recruiter cold-call inbox spam because I frankly don't want to participate in what's become a propaganda mill this election season... I use react and recommend the technology all the time, I have friends who do great work at facebook, and most importantly, I actively use it every day... I just personally want to contribute my efforts elsewhere (after all, cigarette smokers smoke cigarettes every day too).

At the end of the day, the credibility of the fact checkers comes from community consensus on the legitimacy of fact checkers. The problem is that micro-targeting groups while maximizing eyeball engagement has connected these groups more closely while breaking community consensus. Without community consensus, fact checkers have no credibility (and neither will Facebook the eyeball platform).

In other words, the paradox is we're more social yet more isolated at the same time.

I appreciate this fact checker gesture, but at the end of the day, facebook's real credibility problem is they need to align their profits with community and community consensus, not being individually social.


As a followup thought,

> facebook's real credibility problem is they need to align their profits with community consensus and being community-social, not being individually social.

The best litmus test for this is going to be whether everyone will see the same "disputed by 3rd parties" warning, or whether that warning will be micro-targeted to individual echochambers like everything else is.

If my echochamber will show a warning next to a breitbart piece but my conservative uncle won't see it because his echochamber disagrees with the fact checking, that's going to be a huge red flag that facebook is only continuing the individually-social behavior that created this credibility problem in the first place.


"fact checkers comes from community consensus on the legitimacy of fact checkers..."

There's a danger there. "Community consensus" condemned Galileo when he dared mention a heliocentric solar system, for one example. Did that make the Earth-centered solar system "fact"?


Community consensus also eventually vindicated him.

More importantly, what's the alternative? Turning it all off and thinking independently is one option. Kudos to those who do that, but if you're here reading this, by definition you want to participate in social media. So then the question becomes how do we build a less toxic and more productive social media?


> So then the question becomes how do we build a less toxic and more productive social media?

1. Enforce a basic standard of conduct. That is, define the point at which a conversation becomes toxic. Encourage people to "respectfully disagree" with each other so conversations don't degrade too far.

2. Enforce posting/commenting styles similar to something like the AP Style Guide. Encourage proper punctuation and grammar throughout the platform. Also encourage the community to enforce this on their own. (Yes, I am a grammar nazi in this case. But I'm tired of seeing things like "u r stoopid !!!" as a post or comment.)

3. Hire or appoint real people to review complaints. Do not rely on an automated system for this.

4. Use a time-ordered feed only (newest posts first). Do not show an algorithmic feed "tailored" to a user's current interests.

5. Operate as a small group/company. Don't try to become a publicly traded multinational corporation with too many investors to answer to.

6. Be as transparent and neutral as possible. Have a place for users to chime in on the development of new features and bug fixes.


> just UPDATE stories SET disputed = t WHERE ID = ?

It's not that easy. They need to provide a link with an explanation of why a certain story is disputed.

It's true that Facebook has a lot of power. But that was already the case before this update: they could manipulate what stories a user saw, which could be used to influence a large part of their user base. I don't see this as adding that much power. It's also easier to check this mechanism than feed manipulation.


> We’ve found that a lot of fake news is financially motivated. Spammers make money by masquerading as well-known news organizations, and posting hoaxes that get people to visit to their sites, which are often mostly ads. So we’re doing several things to reduce the financial incentives. On the buying side we’ve eliminated the ability to spoof domains, which will reduce the prevalence of sites that pretend to be real publications.

What do they mean "the ability to spoof domains"? Was that a technical issue? Or do they mean they're blacklisting sites such as http://abcnews.com.co/, progenitor of such hoaxes as El Chapo escaping from prison (again) [0]

Given how easy it is to fool people with ambiguous characters/names, blacklisting URLs that aren't all ASCII might be a good precaution too.

[0] https://en.wikipedia.org/wiki/ABCnews.com.co



I saw people share fake articles from that site on facebook. It is a big problem because non-technical people do not know that abcnews.com.co is a subdomain of com.co, not a subdomain of abcnews.com.
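
You can see it programmatically with the tldextract package (a Python sketch; tldextract consults the public suffix list, which treats "com.co" as a suffix):

  import tldextract

  for url in ("http://abcnews.com.co/some-hoax", "http://abcnews.go.com/real-story"):
      parts = tldextract.extract(url)
      print(url, "->", parts.registered_domain)

  # http://abcnews.com.co/some-hoax -> abcnews.com.co
  # http://abcnews.go.com/real-story -> go.com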


> non-technical people do not know that abcnews.com.co is a subdomain of com.co, not a subdomain of abcnews.com

This is a huge issue. A lot of HN always seems to forget that there are a lot of users out there who are not as gifted with technology as we are. (Case in point, my mother just learned what HTTPS meant last week through an NBC Nightly News segment.)

Unfortunately, it seems to be largely generational in nature.


Isn't .com.co a second-level domain for Colombia?


I wonder if this will be as sketchy as politifact? I hope it won't be, but I reckon it might be. (What I mean is: extremely uncharitable readings of things that they don't like, but overly charitable readings of anything that they like.)


Politifact is one of the organizations on the approved list [1], and the closest thing I can find on that list to an organization anyone not on the left is likely to regard as politically neutral is the Associated Press. So I'm not really seeing what Facebook expects this to help.

[1] http://www.poynter.org/fact-checkers-code-of-principles/


Maybe stop the spread of nonsense like 'pope endorses Trump' or pizzagate?


I wonder how this would affect, say, the claim that bulk marketing emails for Trump hotels were actually secret communications with Russia, as reported by Slate and repeated by a certain @HillaryClinton on Twitter. (And, for that matter, the associated belief that the FBI concluding they were a non-issue was proof that the FBI are backing Trump.)


would these measures end up blocking something like A Modest Proposal?


A Modest Proposal is not, and never was, "News". It is opinion.


Most news is (unfortunately) nothing more than opinion. Either the opinion of "an unnamed official", the opinion of a contributor, the opinion of a sponsor, or the opinion of an editor.

Take the recent opinions of anonymous officials that Russia's objective in leaking DNC corruption was to get Trump elected. That's being reported on as news and fact, with the opinion aspect of it turned down to obscurity.


maybe Best In Show, that dog documentary, is a better example.


Maybe reality doesn't match your opinions.


If one suspected this whole effort will be focused far more on politics than on truthiness (I know... amazing!), you've just confirmed that suspicion. Congratulations, I guess.


You're the one focused more on politics than truthfulness.

What exactly do you disagree with in this code of principles? [0] Or is the problem that reporting reality accurately undermines your fantasy?

[0] http://www.poynter.org/fact-checkers-code-of-principles/


Haha yeah after one post I'm "focused". Meanwhile you have seven posts on this page, none of which obey the Principle of Charity. There's still time to edit!

I see nothing wrong with the document you link.


Maybe it takes a little more to convince anyone of anything than merely to say "no, you're wrong" over and over and over again. Would that the world were so simple, perhaps. Evidence suggests that it is not.


I agree that saying "you're wrong" isn't an effective rhetorical technique, but trying to argue with alt-right trolls is a waste of effort.

Sites like Snopes and Politifact do the research, with primary sources, to actually validate claims. Since conservatives can't actually invalidate those claims, they've decided to simply claim that any source which disagrees with them is biased.

It's gaslighting.


The evidence is often correct. It's the assignment of weight to each claim, lack of charity when reading, and therefore output of the truth-o-meter which is biased. This isn't just rhetoric; it can be evaluated rationally.

Worth scrolling through this Twitter feed [0] for individual examples. They can be validated and many are not just 'gaslighting by the right'. And others feel this way, too [1].

[0] https://twitter.com/PolitiFactBias

[1] https://twitter.com/jamestaranto/status/809480523693387777


To some extent, I agree that Politifact's ratings are bad because they're inherently subjective. Thankfully Facebook's system will allow you to click through to the original article and read the evidence for yourself.

The latest claim is that even Snopes is politically motivated and lying. The amount of gaslighting which the alt-right is doing is absurd.


> the closest thing I can find on that list to an organization anyone not on the left is likely to regard as politically neutral

The American right's association with reality is well known to be, uhh... tenuous. So it's not too surprising that they tend to disagree with people who verify facts.

Edit for downvoters: The mainstream positions of the Republican party are, objectively, factually incorrect on the issues of climate change, evolution, and women's health. I'm not aware of any issue where the left's position completely disagrees with established scientific fact. Thus, anyone concerned with basing their understanding of reality on facts is going to be "left-leaning" by simple virtue of the right's disregard for fact.


Please keep partisan politics off HN.


I was explaining why fact checkers appear to be left-leaning, but OK.


Many on the right consider theirs to be the only political tendency compatible with reality, too. They're incorrect in so considering, too.

I've vouched your prior comment because I believe its content should be visible. It would be a shame if anyone were to confuse that action with an endorsement.


How do you come to the conclusion that politifact is as sketchy as the sites that were churning out fake news during the election?


It's not, but it is often very misleading.

I don't think the fake news is actually what won the election anyway. If anything it was Wikileaks' extreme authenticity that won the election, and I'm struggling to understand how this is 'fake'. I'm assuming that they're trying to delegitimise it by association however.

I also find it hard to believe that it was Russia that orchestrated the leaks with zero evidence shown. It's not so easy to trace cyberattacks, and it just seems like officials are looking for a scapegoat for why they didn't win.

There seems to be just as much fake news coming out of the left as from the right, albeit smarter less off-the-wall/gutter stuff.


I agree with this, but I suspect the bigger factor in the election result was that the "straight" news media had stopped going through the nonpartisan motions. They forgot that the tradition of the appearance of objectivity is more for the benefit of the news media than for the public. Once no one has a halo of open-mindedness, there's no reason to listen to Brian Williams over Ed Schultz or Sean Hannity. They could push a more conservative or independent public slightly to the left or to the center, but only slightly, and only when that's all they tried to do. This time they pushed too hard, and the whole edifice collapsed.

Frankly the news media seems not to have learned this lesson. If anything the last few weeks have seen them even further off the reservation. When in a decade it's clear that the institution of journalism is completely dead, this episode will loom large in explanation. "Journalists" are seriously arguing that providing relevant information about political candidates is an "attack on the nation". They're mortgaging all their credibility for the latest unsourced meta-analysis of months-old public information spoon-fed to them by petulant spurned spooks. Every day they come up with something goofier, and yet they can't escape the impression that we've heard this all before.

What stands out the most is the complete lack of objectivity. If a Republican candidate were crying about losing due to anonymous (even if vaguely Slavic...) revelations about her banal tawdriness, they'd be falling over themselves to laugh at her.


I would expect that behaviour from any political news fact checker, humans being what they are.


I thought Politifact was pretty well regarded, and actually tended to be a little too risk-averse in calling out bullshit.


Nope. Politifact will generally use the most generous possible reading of what a Democrat says, or the least generous reading of what a Republican says, when determining truth or falsehood.

Take a look at these two stories:

http://www.politifact.com/truth-o-meter/statements/2016/oct/...

http://www.politifact.com/truth-o-meter/statements/2016/sep/...

Now, the first story is simply true. Hillary Clinton did attack her husband's mistresses and accusers, and there's a long history of documentation of that. But Politifact labels it "mostly false" because it was never proven to have come directly from her. As if Sidney Blumenthal just took it upon himself to slander Lewinsky as a delusional nymphomaniac.

In the second link, Clinton claims none of her emails were labelled Top Secret. Note the dodge, here. She didn't say, "I didn't send or receive classified information", she said "those emails didn't contain the correct headers". I wouldn't be surprised in the slightest, given that we know she tried desperately to modify the records of the emails once they were discovered, if those headers were simply stripped either before or after the fact. Either way, she is simply not stupid enough to not know what she was doing. But she claims innocence through stupidity, and politifact labels it "mostly true".

What do both of these examples show? That when Hillary walks the line and is careful not to explicitly say anything demonstrably false, while clearly lying in substance, politifact says she's honest. When Trump says something that's substantively true, while not paying attention to specifics, politifact says he's lying.

This is not intellectually honest fact-checking. Letting one candidate play word games and lie while calling out the other for being less careful but telling the truth is not fact-checking, it's shilling.


> Hillary Clinton did attack her husband's mistresses and accusers, and there's a long history of documentation of that

> But Politifact labels it "mostly false" because it was never proven to have come directly from her.

These statements are mutually incompatible. There can't be a long history of documentation for something which is proven.

Start your own fact-checker. Your examples just affirm that Politifact is doing its job correctly.


If all of your employees are doing one thing, it's safe to assume you've told them to, even if your fingerprints aren't on the proverbial gun. Your comment is exactly what I'm accusing politifact of: you're being as uncharitable as possible to the claim, ignoring common sense and plain language in order to find a reading of it you can say is false.

Thing is, you're not pretending to be impartial and non-partisan. Nor am I. Politifact is. When Politifact's standard of objectivity is indistinguishable from that of a reasonably intelligent random commenter on the Internet, that's a problem.


  There can't be a long history of documentation for something which is proven.
Um, what?


That was a typo. It should be:

> There can't be a long history of documentation for something which is unproven.


> Hillary Clinton did attack her husband's mistresses and accusers, and there's a long history of documentation of that.

Source?


From http://mobile.nytimes.com/2016/10/03/us/politics/hillary-bil...

But privately, she embraced the Clinton campaign’s aggressive strategy of counterattack: Women who claimed to have had sexual encounters with Mr. Clinton would become targets of digging and discrediting — tactics that women’s rights advocates frequently denounce.

The campaign hired a private investigator with a bare-knuckles reputation who embarked on a mission, as he put it in a memo, to impugn Ms. Flowers’s “character and veracity until she is destroyed beyond all recognition.”

In a pattern that would later be repeated with other women, the investigator’s staff scoured Arkansas and beyond, collecting disparaging accounts from ex-boyfriends, employers and others who claimed to know Ms. Flowers, accounts that the campaign then disseminated to the news media.

By the time Mr. Clinton finally admitted to “sexual relations” with Ms. Flowers, years later, Clinton aides had used stories collected by the private investigator to brand her as a “bimbo” and a “pathological liar.”

Mrs. Clinton’s level of involvement in that effort, as described in interviews, internal campaign records and archives, is still the subject of debate. By some accounts, she gave the green light and was a motivating force; by others, her support was no more than tacit assent.

----

The Clintonian strategy of attacking, slandering, and blackmailing his accusers is a matter of historical record. And there are many people in the know who said she was a driving force behind it, while others claim she merely acquiesced to it. You can read Hitchens recount the sordid affair here:

http://www.vanityfair.com/news/1999/05/christopher-hitchens-...

Or read No One Left to Lie to for a more in-depth look.

Now you don't have to believe she was the one driving the bus in the smear campaigns against her husband's accusers, even though plenty of people who were there at the time have said she was. But it does show that Politifact is being especially uncharitable to that claim, when they simply could have said it was disputed.


Facebook seems to be stumbling down the same path that Twitter did here in terms of attempting to regulate some content, by means that they themselves determine to be unbiased and in the best interests of users... which hasn't quite worked out so well for Twitter.

If everything is fine and unmoderated, there is no bias. If some things are flagged, not allowed, etc, I don't care how much a 3rd party is trusted. It's still humans picking and choosing one piece of content over another, and there are inherent biases involved there.


Unmoderation = infinite spam.


Spam platform = infinite spam.

Facebook makes its money from spam, as do the news.

Because they've lost credibility, there's this new huge push to label things as condoned or not. That's not going to solve the ultimate problem, which is that the information industry has become a platform where the highest spam bidder wins dissemination on the platform, rather than what spam is shared by people.

This is just one more step in the centralization of control of social media. There's still going to be plenty of spam, it just has to pay the gatekeeper (Facebook) or otherwise be on its good side.


Exactly. What someone else might call spam, Facebook might call "great news."

If having free speech produces "infinite spam", then that's a byproduct worth the benefits of not being limited on what ideas are ok to say and what ideas are not.


Still think this is a horrible move for a variety of reasons but I have one big question.

Where can I access the reasons and the organisations which have disputed the link?

Without this information, this is even more terrible than I had previously thought it would be. If we cannot see who disputed the link and why, this is massive potential for abuse.

edit: Apparently the reason is given in a link


> If the fact checking organizations identify a story as fake, it will get flagged as disputed and there will be a link to the corresponding article explaining why.

You will be able to click through to read the full analysis of why a story is false.


Ah bugger my bad, missed that. I hope the analysis also identifies the organisation/s who provided it.


Presumably it'll be a link to the article on that organization's website.


I really hope that each disputed link is given a comprehensive response and not a cookie cutter response.

I think a template response is probably the most likely outcome, especially considering the number of articles that are going to be checked by these organisations. Do they really have the resources available to give a bespoke response?

I think there is danger if these organisations are allowed to submit template responses.


The ultimate issue here is that these information providers need to train their users to be gullible, as this is their business and partnership model.

These platforms only function because advertisements and political messaging have to be seen, and are seen, as credible and native to the content being traded by the people using the platform.

Users are trained to uncritically believe the things that show up on their feed, because if they were trained to be skeptical, the platform would go bankrupt.

Similarly, news media requires readers to have full faith - almost ideological allegiance - in what they are reporting, while at the same time they need an uncritical readership so that they can promote political and consumer advertisements, sponsored and 'earned' content promotion and opinion pieces as though they were fact.

The entire industry functions on the basis of gullible and uncritical readership, media allegiance and sponsored content.

The industry has fallen to extremely low credibility ratings, and a series of nonsensical, unpredictive and obvious reporting out of the 'most credible' parts of the industry has left a vacuum whereby readers do not know what to believe, and so accept whatever is most convenient and most readily available.

The solution is to create a credible news media industry, with no sponsored or partner content, no PR, no government propaganda, and where the readers (and skimmers) are the customers who pay for the information. The industry should compete to provide context and reliable information, and readership should be encouraged to be skeptical of the information presented to them at all times.

Maintaining the current mispractice of the poorly functioning industry and adding protectionist measures, gatekeeping and sponsored fact-checkers is not going to solve the problem.

"I'll tell them what to think" is not a solution to "these people are gullible."


"uncritical readership"

I'm guessing you've never worked for a news organization.


I freelanced.

To be clear by 'uncritical' I mean 'gullible' - as in 'uncritical thinking'.


> Once a story is flagged, it can’t be made into an ad and promoted, either.

After a single flag from a single user?


> We’ll use the reports from our community, along with other signals, to send stories to these organizations. If the fact checking organizations identify a story as fake, it will get flagged as disputed and there will be a link to the corresponding article explaining why. Stories that have been disputed may also appear lower in News Feed. It will still be possible to share these stories, but you will see a warning that the story has been disputed as you share. Once a story is flagged, it can’t be made into an ad and promoted, either.

So, "flagged" means "determined to be fake by a fact checking organization contracted by Facebook that disputes the article".


Yep, just caught that myself. Sorry, it was two large screenshots up and I missed it. :-) Thanks!


I kind of wish they took no action. It is inevitable fb will be the "War of the Worlds" or National Enquirer wrt "news"; doing nothing gets them there quicker.

Fb is unparalleled at sharing photos and personal information with friends. News? Not so much.


Nothing can help an ignorant or illiterate population. If that is your premise the next logical step is censorship and this is censorship.

How does the arrogant assumption of ignorant readers so weak-willed they can be 'led astray' by 'fake news' even take hold? Do people really believe that 'other people' believe everything they read?

The ridiculous idea that people are naive is a fabricated construction designed to empower some while 'infantilising' the rest, and is inherently anti-democratic.

Ordinary scepticism and not believing everything you read or hear is a basic human skill.

This fake news scare, and blaming 'Putin' for everything wrong in the world, is so illiterate, juvenile and insulting to basic human intelligence, and reflects something dangerously wrong in our media, politics and the tech community.


The sad thing is, most people who share fake news will just claim that Facebook is now part of the Soros-Clinton conspiracy to support the MSM and hide the truth from the public. I like that they have a specific set of guidelines for determining the veracity of a news story, and I'm sure this will go a long way towards removing the financial incentive, but I'm guessing it won't sway many minds.


How about a setting to block all link shares if this is supposedly such a huge threat?


Expecting corporate media to sanitize itself might be a new proof of Einstein's rule of insanity. The only short-term solution to rampant widespread disinformation in the MSM is a healthy and sustained level of cynicism. It's time the media was truly disrupted and dispatched to the bottom of the sea.


The timing of this makes it seem politically motivated. I've seen plenty of fake news on my liberal feed. Things like 9/11 conspiracy theories, anti-vaccination stories, stories about the dangers of genetically modified foods, etc.


If there's an obvious partisan slant to the 'fact-checking' then people will eventually discredit the warnings at best and at worst they will become the butt of many censorship jokes.

The SNL skits are probably writing themselves as we speak... Girl posts on her friend's wall "I'm so happy for you!!!!" [WARNING: 3rd party fact checker determined this to be false]

If that happens, the brand damage to facebook will be real


The warnings may be discredited but the disputed stories will appear lower on the news feed according to FB. "Appear lower" is the least controversial way for FB to dip its toe in this water, so we'll see how the censorship actually plays out.


I know. This almost feels even bigger than politically motivated. This seems to me like a full on attack on the open internet. How quick before Edward Snowden is producing fake news and a new generation doesn't trust him?


Does it matter what the motivations are if they are using the same criteria for filtering all kinds of fake news?


Humans in the loop means the criteria will not be the same.


What is the alternative? Right now the alternative is to add another layer of bias in between.


The alternative is to allow free and open communication that is unmediated, unannotated, and unmoderated by any authority, and to allow information consumers to decide what they should and should not read and believe themselves.

It's the difference between the web and a walled garden.

I get that FB is interested in mediating, annotating, and moderating content, since they are a walled garden.

That's why I choose not to use FB, which is my right as a web user.


Since you're not a Facebook user, why do you care?

They have the right to do whatever they damn well please on their website. This isn't breaking the "open internet." Nothing about the design of the internet allows you to dictate how people run their websites.


That's a pipedream. There will always be human interference, whether it be direct, or a consequence of some humans' decisions regarding algorithm design.

As it is, Facebook's current (pre-fact-checking) algorithm prioritizes shallow engagement over other metrics. That leads to sensationalist clickbait receiving preferential treatment. Human decisions led to this outcome.


I am all for free and open communication as long as it is unrestricted in all respects.

But facebook has been filtering content to show you what they think is relevant. In such a scenario, I'd prefer if they added moderation to weed out fake news or at the very least mark it as such.


No. That causes Pizzagate to happen.


Fake news was a massive problem this election cycle, and it really highlighted Facebook's large influence and their inability to actually do anything about it. So yes, Facebook dealing with fake news here is a reaction to what happened during the election, but that's not a bad thing. Fake news turns out to be a serious problem and it needs to be fixed, regardless of your political leanings.

Edit: Why am I being downvoted for this? If you disagree with me, please explain why.


There's no evidence that "fake news" is actually a serious problem, other than getting a lot of likes on Facebook. Seems to me "fake news" is an agenda-driven narrative by the mainstream media to salvage their waning reputation and discredit alternative media. For instance: https://theintercept.com/2016/11/26/washington-post-disgrace...


Yeah maybe, even the pope said fake news isn't real.


> Edit: Why am I being downvoted for this? If you disagree with me, please explain why.

Truth has become a partisan issue. This thread is overrun with alt-right trolls.


You're probably right. I notice you also got downvoted, as did a sibling comment. It's pretty frustrating to see this kind of behavior on HN.


Issues are often only addressed when they become important enough to have major consequences. That doesn't make reaction inherently political.

The amount of conservative gaslighting out there today is really troubling. Any source which doesn't agree with their biased version of reality is "liberal."

It's amazing to me how effectively the alt-right has taken over Hacker News to the point where fact-checking is considered harmful.


That last bit isn't an accurate description. What's going on is that the societies from which HN users come are deeply divided, and those divisions are inevitably showing up here, as they would in any large enough sample.


I just disable the FB news feed


The kosher fact checkers list includes the biased and opinion-based Politifact and Snopes.


We detached this trainwreck of a subthread from https://news.ycombinator.com/item?id=13186829 and marked it off-topic.


Fact check please.


They aren't. You'll have to provide evidence for your claim.


Let's see some statistical analysis that the fact checkers FB claims are disinterested arbiters of fact aren't in fact pushing an agenda before we go along with censorship maybe.


I'm not willing to call it censorship, and you still haven't provided evidence for your claim that two routinely considered reputable fact checkers aren't.


Can you supply any concrete examples of fact checkers that have undergone such rigorous evaluation?


Let me guess: any source which doesn't corroborate your worldview is biased.


Politifact and Snopes push their opinionated worldview as though it is objectively true and morally righteous.


> Your belief is that Politifact and Snopes push their opinionated worldview as though it is objectively true and morally righteous.

Wouldn't want to be guilty of the same sin, right?


I do not put myself out there as a disinterested arbiter of fact.

Facebook isn't going to be appending my critical analysis to alleged fake news stores.


Given this:

> I do not put myself out there as a disinterested arbiter of fact.

...all I can say to this:

> Facebook isn't going to be appending my critical analysis to alleged fake news stores.

...is quelle surprise.

Were you expecting some other outcome?


My argument is very simple but you apparently misunderstand.

morgante guessed that I reject fact checking simply if it doesn't confirm my worldview.

I rebutted that the fact checkers in question aren't simply disproving my worldview. Rather they are advancing an opinion-based counter-narrative under the guise of objectivity.

You replied and suggested that I should be held to the same standard as a purported arbiter of truth.

I rebutted that I do not claim to be a sterile, objective source of bald fact-checking. And a major news consumption platform is not appending my opinions to people's feeds and elevating my opinions to objective status. My opinions-masquerading-as-fact won't be advanced by Facebook as evidence of the disreputability of those I disagree with.

You replied incoherently as far as I can tell.


First I will explain the joke in my previous comment, since it seems to have gone over your head. Read what you wrote literally, as if someone else wrote it:

> I do not put myself out there as a disinterested arbiter of fact. > Facebook isn't going to be appending my critical analysis to alleged fake news stores.

The joke I made works--or doesn't--based on whether or not you can see those statements as a self-evident good thing: wouldn't it be good and proper that Facebook isn't going to be appending your critical analysis to alleged fake news `stores` (sic), given that--by your own admission--you aren't putting yourself out there as a disinterested arbiter of fact?

Now to be a bit more serious.

You claim to have an argument...but, where? What is your argument? Where can I find it stated, explicitly, in words?

I can't imagine myself walking away from this conversation saying "you know, despite my initial reaction `halpiamaquark` was actually totally right about...". I mean really, how should I finish that sentence? What's the point you came in here trying to make?

You come across as wanting the world to know you don't like Snopes and Politifact and...that's it?

The play-by-play as I see it starts with this:

> The kosher fact checkers list includes the biased and opinion-based Politifact and Snopes.

That's not an argument; it's a statement of your opinion, but presented as fact.

Now purely talking rhetorical strategies I can't blame you here; after all if you'd lead with this:

> The kosher fact checkers list includes Politifact and Snopes. In my opinion, Politifact and Snopes are biased and opinion-based.

...I think the natural response would be "...and I care because? Is there going to be a point here, eventually?". On the other hand, if you just state your opinion like a fact it's pretty good bait--you'll get people jumping in to play the "I'm going to respond to what I guess you're probably thinking" game.

That's what happened: `morgante` played that game and made a guess, and then, in your own words:

> I rebutted that the fact checkers in question aren't simply disproving my worldview. Rather they are advancing an opinion-based counter-narrative under the guise of objectivity.

I'd characterize this as (a) rejecting his guess at what you think and, then, once again, (b) stating your opinion about Politifact and Snopes as if it were fact.

We're now 3 comments in and there's still no argument: what, again, is the point you're actually trying to make? I mean it's quite clear that you think Snopes and Politifact are biased because you repeat that claim every chance you get, but what's the implication? Can you be direct and make the point you're trying to make, or is there really no point to be had here?

At this point I jump in. It was amusing to me that there was something about presenting opinions as fact that had you so mad--so mad!--while on the other hand you seemed to have no way of expressing yourself other than, well, presenting your opinions as facts.

So I did make a snarky reply but I have to disagree with your characterization of it:

> You replied and suggested that I should be held to the same standard as a purported arbiter of truth.

...no, that's not it at all. Where are you getting this "should be held to the same standard as a purported arbiter of truth"? That's all you and your quarky ways, and presumably all your unstated ideas and assumptions that could perhaps comprise an argument if you were to surface them explicitly.

All I actually did was (a) take your quote:

> Politifact and Snopes push their opinionated worldview as though it is objectively true and morally righteous.

...and prepend an "I think" to it, and then (b) point out that failing to do so would've left you seeming guilty of the same thing you are so mad--so mad!--about Politifact and Snopes (allegedly) doing.

Your follow-up response was where you finally introduce some actual statements of fact:

> I do not put myself out there as a disinterested arbiter of fact. > Facebook isn't going to be appending my critical analysis to alleged fake news stores.

...but despite finally providing something other than simple re-statements of your opinions, as a response this was incoherent.

I mean, yeah, I can guess what you might be thinking, but I'm not going to do your job for you.

If you want to claim to have an argument you have to, you know, actually make it: state your premises, show your steps, show your work, and how they all fit together to make the point you want to make. Otherwise you're just venting!


[flagged]


Please stop, both of you. This kind of tedious back-and-forth degrades this site and adds nothing of value.


Done and noted moving forward.

I don't envy you your job.

I also don't like your odds here: the current site mechanics and moderation pace don't seem adequate to preserve the good parts of the site in the long run, but any significant overhaul would also risk the same.

I wish you luck, with sympathy.


> "Because Politifact and Snopes are biased and opinion based, they are not credible to determine whether stories are 'fake'."

Even here you could go more honest and humble--"I think that Politifact and Snopes are biased and opinion-based to the point I can't see them as credible to determine whether stories are fake"--but such a modest approach would indeed be very counterproductive if your real purpose here is to, as you say, invite reactions and watch your karma to take the pulse of the community.

> Yes, I provided no evidence. I don't care to. What minuscule effect can I hope to have on a tiny handful of strangers on the internet?

None, with that attitude! Have a good evening.


Extraordinary claims require extraordinary evidence.



