Hacker News
Facebook to ban misinformation on voting in upcoming U.S. elections (reuters.com)
102 points by crunchiebones on Oct 15, 2018 | 132 comments



"On the issue of fake news, Facebook has held off on a total ban, instead limiting the spread of articles marked as false by vetted fact-checkers. However, that approach can leave fact-checkers overwhelmed and able to tackle only the most viral hoaxes."

Does anyone else find this concerning? Even the smallest bias held by fact-checkers can dramatically skew things towards one side e.g. only "fact-check" articles that favour one side.


You find it concerning because it's easy to abuse. The fact-checkers get to decide what is "true", and other Facebook users are unlikely to listen to you if you try and convince them that the fact-checkers are wrong.


Easy to abuse has nothing to do with it.

Positioning anyone to be an information filter is wrong.

Full stop.

This is totalitarianism 101. It doesn't matter if they're right or wrong. Freedom isn't predicated on moral superiority - it is the polar opposite. Freedom is exactly for those who are "wrong".

I cannot imagine a more dangerous position for Facebook to take.


Ok, then how do you prevent things like https://news.ycombinator.com/item?id=18225364 ?


If you want to try and assign Facebook responsibility, which is debatable, then I would blame them for the false accounts and real accounts that were taken over. It's simply wrong and should not be tolerated in any circumstance.

Separately from morality, I would argue it negatively impacts Facebook's survivability. If your _social_ platform cannot be trusted to honestly deliver its only product, it will surely die.


There are costs to everything you do or don't do.

If your argument is that censorship prevents genocide, I would point you to Nazi Germany, or Soviet Russia.


How did you prevent government genocides in the past when they used newspapers or TV?


I don't quite understand your argument; Facebook is a private corporation, and can regulate speech in any way they choose. It's not "totalitarianism 101"; if anything, it's a libertarian wet-dream of a corporation exercising their rights as they see fit.


Honestly, the much harder problem is convincing people that fact checkers are right if they already disagree with the fact checker. And they are much more numerous than those who already happen to agree with the fact checker.


In my experience the biggest problem with fact checkers is that facts are complicated.

Somebody says "the speed of light is 300,000 km/s" and you go to a fact checker and you get:

> "TRUE: The speed of light is approximately 300,000 km/s."

Then you go to another one and you get:

> "FALSE: the speed of light is actually 299,792,458 m/s in a vacuum and has a medium-dependent value through air, water or another medium ... "

Then you get into an argument with someone where the distinction matters and they cite the first one, or get into one where it doesn't and they cite the second one.


Trying to overcompensate for bias is how we got into this mess in the first place. Rather than picking a set of biases and viewpoints and presenting that, news orgs and silicon valley have decided that $thing_i_agree_with and $the_opposite_view deserve equal reach/airtime.


I totally expect that Facebook will just pick biased "fact-checkers" on both sides of the aisle and call it done.

Google has over 150 organizations that are allowed to "fact check" through their API: https://www.blog.google/products/search/fact-check-now-avail...


Feel free to look at the list of fact-checkers that Facebook uses for any particular country, if you're so inclined:

https://www.facebook.com/help/publisher/182222309230722


[flagged]


The ministry of truth funded by a global oligarch who is famous for breaking the Bank of England. Wake up people!


It's a journalism startup much like ProPublica, i.e. taking on long-term projects with in-depth reporting.

And take your anti-semitism elsewhere.


> And take your anti-semitism elsewhere.

Which part of 08-15's comment are you referring to with this statement?


The part he disagreed with


Herbert Marcuse picked up on this idea that mass media broadcasts any opinion no matter how stupid and misinformed in 1965, and it's only gotten worse:

>Within the affluent democracy, the affluent discussion prevails, and within the established framework, it is tolerant to a large extent. All points of view can be heard: the Communist and the Fascist, the Left and the Right, the white and the Negro, the crusaders for armament and for disarmament. Moreover, in endlessly dragging debates over the media, the stupid opinion is treated with the same respect as the intelligent one, the misinformed may talk as long as the informed, and propaganda rides along with education, truth with falsehood. This pure toleration of sense and nonsense is justified by the democratic argument that nobody, neither group nor individual, is in possession of the truth and capable of defining what is right and wrong, good and bad. Therefore, all contesting opinions must be submitted to 'the people' for its deliberation and choice. But I have already suggested that the democratic argument implies a necessary condition, namely, that the people must be capable of deliberating and choosing on the basis of knowledge, that they must have access to authentic information, and that, on this basis, their evaluation must be the result of autonomous thought.

(From Repressive Tolerance, 1965)


At the same time, sometimes the informed are wrong because of fundamentally flawed assumptions. Most of science at one point thought the world was flat. More recently, the mainstream experts have been wrong on:

- the superiority of American mass motorization

- the environment

- the drug epidemic

- the economic sustainability of a housing-investment-centered economy

So on and so forth. Democracy is great because it allows some level of correction if elites are horribly misguided, even if it can be too late. But you don't see misguided policies rapidly killing millions of domestic innocents because of a lack of feedback: https://en.wikipedia.org/wiki/Four_Pests_Campaign And we certainly have not come up with workable alternatives that work better.


  Most of science at one point thought the world was flat. 
Generally, none of science thought the Earth flat. Prehistoric creation stories lacked science altogether.

The flatness that some in Abrahamic traditions clung to was partially motivated by apparent conflict with scripture, such as explaining the stoppage of the sun miracle.


>Rather than picking a set of biases and viewpoints and presenting that, news orgs and silicon valley have decided that $thing_i_agree_with and $the_opposite_view deserve equal reach/airtime

No, we do that because a marketplace of subjective ideas is a good thing.


Except a lot of the "ideas" aren't subjective.

The most obvious case would be climate change. Regularly, climate skeptics are given the same amount of airtime as climate scientists, despite the fact that the evidence and opinion is overwhelmingly in support of climate change being a real problem.

A realistic panel would be 98 climate scientists on one side, and two climate skeptics on the other.

By giving someone with far less evidence equal airtime and respect, it lends credence to that point of view despite it being so poorly supported by facts.

This is most definitely not a good thing.


I had a wonderful chemistry teacher once. Someone asked him what he thought about climate change, and he gave a fantastic response.

"The atmosphere is a lot like a fish tank full of water. If you put food coloring in that fish tank, it turns slightly red. The more food coloring you put in that tank, the more red it gets. If you take a flashlight and try and shine light through one side of the fishtank and out the other side, the food coloring blocks some of the light. The more food coloring you put in, the more light it blocks... and turns to heat. Try it. You'll make the water warmer.

"That water is like our atmosphere. That food coloring is like CO2. And that flashlight is like our sun."

It's not difficult to convincingly and simply explain the evidence for man-made climate change. Evenly matched debaters would predictably win and lose if they were told to argue for or against climate change. The reason why climate scientists fail to convince TV viewers about climate change is that they are scientists: they spend more time practicing science than practicing persuading laypeople in terms that they will understand. Going up against demagogues, they have zero chance.

The game isn't bad. It works incredibly well most of the time for most of the problems we have encountered. It doesn't work so well when one side is incredibly bad at convincing common people and is not willing to change argument tactics, or when both sides are making completely subjective arguments.


I wish it was a matter of outreach, but I fear people won’t remember your wonderful analogy unless they identify as a member of your tribe. Instead, putting skeptics on TV who are a member of their tribe, and who assured them it is nothing to worry about, allows them to carry on mining coal and driving their SUVs where they need to go and not updating any of their beliefs. All those things are expensive and painful for adults, but weren’t problems for you as a child.


I heard that story in college. It stuck out because it was pithy and convincing.

If you think people don't listen to new ideas or change their minds, do you think people should only be allowed to hear what you think is true, so that they don't hear wrong ideas to begin with?


Nope, I’ve just lost the illusion that everyone is as open to new ideas as college-you was, and that we can simply go reach the people who don’t want to be reached.

I suspect that the fact that you were taking chemistry in college means you were already in a very different state to receive new ideas compared with your average TV news viewer who is currently skeptical of climate change, and that difference is really important.

I’m not saying let’s throw up our hands, but people like Gore and Bill Nye are trying outreach, and it isn’t because they are bad at articulating ideas that this is still a problem.


I once saw an interview conducted by, I believe, Tucker Carlson with Bill Nye regarding climate change. It was obviously conducted entirely in bad faith and was just painful to watch. Every time Bill Nye would begin to answer a question, the host would cut him off with random nonsense like “Isn’t it true, Mr. Nye, that you’re not even a scientist?!” and the process repeated until completion. It was obvious that it was pushing a completely biased tribal narrative and was not concerned with logical analysis and discussion. And yet sadly the video was uploaded with a title along the lines of “Tucker Carlson Destroys Bill Nye”, with a majority of the comments disparaging Mr. Nye. So I’d have to agree with you that there is definitely a non-trivial portion of the populace that cannot be swayed by facts or discussion and instead sticks blindly to their “team” of choice.


  people like Gore and Bill Nye are trying outreach
Which becomes part of the problem. When your role model is a guy who flies private jets from conference to conference and had a $17,000+ per year utility bill for his home (and not so much as one solar panel when in office), it doesn't convey urgency very well.


No one can pass the purity test for skeptics because it isn’t about that. I’m convinced it’s about saving face for their tribe. I can’t think of a spokesperson skeptics would believe. Maybe Donald Trump saying he was wrong and something needs to be done on National TV would work?


> despite the fact that the evidence and opinion is overwhelmingly in support of climate change being a real problem

I still can’t convince many of my left leaning friends we should be using nuclear power despite the clear risks we face from climate change.


> The most obvious case would be climate change. Regularly, climate skeptics are given the same amount of airtime as climate scientists, despite the fact that the evidence and opinion is overwhelmingly in support of climate change being a real problem.

Is this actually a thing? As far as I can tell what actually happens is that outlets like Fox News just pretend climate change doesn't exist. They don't host a panel on it with climate skeptics, they just avoid discussing it at all, or center discussions around ad hominem attacks on the character or financial/political incentives of anyone who proposes doing anything about it.

Granted it's hard for them to discuss anything at all now that cable news airtime is 89% commercials.


If this person with less evidence refutes the evidence of the other, who has more evidence? Who is to decide whose evidence is valid before the debate? If it's already invalidated, why even have the debate in the first place? If you want a debate, all participants should have a chance to voice their views, regardless of what the other participants hold as opinions about those views. Whether something is poorly supported or not is to be decided after the debate, so basing the voiced opinions on preconceived conclusions is actually not a good thing, which is what you seem to be suggesting.


An accurate presentation of the facts very rarely looks like presenting precisely two opposing sides as having equally valid ideas, and then refusing to contextualise the debate that you've artificially created in doing so.


A presentation is subjective. Unless you are literally showing people recorded values and scientific study methodology verbatim, it is subjective and open to interpretation to some degree.


Of course a presentation is subjective. There are, however, significantly better ways of presenting a debate over something than finding the one person who holds the exact opposite of the current mainstream view and bringing them on radio or television or giving them opinion columns in the newspaper, repeatedly, until they manage to whip up a crowd who are angry enough at the state of the world and want someone to blame to believe them.


Yes I think the cons of this outweigh the pros. Ultimately, people will stop getting their information on FB if it becomes too obvious it is being skewed unfairly.


Only people who don't share that bias would leave.


If this were true, Fox news would long ago have disappeared.


What if one "side" traffics in far more falsehoods than the other "side"? An unbiased approach will fact-check them far more often.


Considering "sides" is itself an a priori bias. Articles can stand or fall on their verifiability.

That the current soup of deliberate misinformation happens to be played more by the loony right than the loony left is an artefact of those misinformers.


I really don't think you should remove information (or even be able to sue someone, except for defamation) for being false, because it gives people the false impression that if information is printed or allowed on Facebook, it's probably accurate. We need to give everyone the impression that whatever they read might be false, because it actually might be. Maybe that will make people think critically about what they read.


Or you could use your critical thinking skills and keep in mind that they're not going to catch everything. This falls in line with the mentality of "don't let perfectionism get in the way of progress".

We need to focus more on teaching critical thinking in schools, it will lead to people who can call out bullshit way more reliably.


How do you think people practically learn critical thinking skills? Occasionally people need to be fooled, and shown that they were fooled.


I think education in critical thinking would solve the problem, but I also think digital media literacy would help too and is probably an easier lesson to scale.


An ability to trace the sources of the posts might be interesting. And connecting that to discourse about the typical accuracy of those sources.

"This content paid for by... here's the other content they paid for/posted, the trend of topics they discuss, and a weighted bullshit meter"

But Facebook plainly wants to moderate the narratives for themselves; until recently their entire news feed was human curated. Now they're just going to keep it behind the scenes, use some algorithms, and offer pay to play, so yeah, they'll get right on doing something that offers visibility.
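The "weighted bullshit meter" above could be as simple as aggregating a source's past fact-check outcomes into a score shown next to the post. A toy sketch of one way to do that; the function name and the smoothing scheme are entirely invented for illustration:

```python
# Hypothetical per-source accuracy score: the fraction of a source's
# fact-checked claims rated true, Laplace-smoothed so that sources
# with no track record start out neutral at 0.5 rather than 0 or 1.
def source_score(true_count, false_count, prior=1):
    """Smoothed accuracy in [0, 1]; 0.5 means 'no track record yet'."""
    return (true_count + prior) / (true_count + false_count + 2 * prior)
```

A source with 9 true and 1 false rating scores about 0.83; one with 0 true and 8 false scores 0.1. The hard part, of course, is who supplies the ratings, which is the whole debate in this thread.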


I think that leaving information up on the website, but making the sponsor plainly visible (along with a list of other items they sponsored), would be a decent feature that Facebook would absolutely never implement.


That is an argument against Facebook's business model.


Journalism requires citations. Otherwise it's just gossip.


It was clear the internet was full of lies long before the AOL dial-up days. If the message hasn’t sunk in by now, I think people aren’t going to get it, ever. I’m not sure there’s even a positive way out of this for Facebook.


There are always new people learning old things about the Internet. Much like cleaning, you will never get to 100%, but without effort the problem will be far worse.


Agreed. Critical thinking is becoming a lost art, and we should not paper over its absence with "workarounds".

Also, who gets to decide what is true? Silicon Valley tech employees lean overwhelmingly to the left. How do you remove this bias?

But really, this is a problem of society in general, not a tech problem. People are trained to take what they hear with a grain of salt. Having official fact checkers lends authority to what is on Facebook, which may or may not be true based solely on whether it was flagged / censored.


This logic would also have us stop prosecuting pickpockets on subways, for fear that people might stop being careful with their belongings.

Private social media platforms should both take steps to remove obviously false material and educate users as to how to recognize misinformation.


I don't know what sort of people you talk to, but I don't treat legal behavior in anywhere near a similar way to the way I treat illegal or directly harmful behavior. The idea of even making such an analogy feels wrong to me.

Pickpocketing physically damages you and your property. To draw a parallel with speech stretches my imagination too thin.


Conspiracy theories that circulate on social media are causing damage to our democracy.

Facebook is a private platform. It's not, and it shouldn't be, bound by the First Amendment.


This. Assume everything is false until proven otherwise.


I don't go that far. I tend to look at a few things:

* what is the source? do they have a reputable history?

* do they have conflicts of interest?

* do they have a history of faithfully printing retractions and admitting mistakes, or a history of exaggerating facts?

* do they have a history of writing accurately about subjects I am an expert about? (e.g. subfields of computer science)

If they score pretty well, I tend to believe in the facts (and evaluate the interpretation) they espouse until I come across something that contradicts it. So far the only source I regularly trust is the Economist, but I haven't researched the history of many news organizations. It's just a starting point for me.


So every time you let your spouse take your car you do a DNA test to make sure they haven't been switched with a look-alike?

That's just a somewhat extreme illustration of how you, and everybody else, have to rely on heuristics to evaluate statements. It's impossible to prove even a tenth of a percent of the facts you rely on daily from first principles.


“Believe none of what you hear, and only half of what you see”


That is terribly inefficient.


That is very effective.


Not really. There is a whole host of information that you'd never be able to operate on, mainly because you as a person do not have the time or expertise to discern what's true and what's not on every subject that comes up.


information or dis-information?

"Lies spread around the world before the truth laces up its boots," according to Mark Twain ... and that was before Radio, TV or the Internet.

Hard to inoculate your population when they're swimming in toxic filth daily.


The upcoming US elections may prove to be a watershed moment for modern democracies in general. There is a widely held belief in academic circles that mass voter manipulation through social media has managed to trump all other kinds of propaganda, causing people to become electoral zombies. This belief is based on flimsy evidence and on a mountain of bias. I hope Facebook and the other media do everything by the book and according to the mandates of their favorite party politics. This will be an easy way to finally dissolve these mass delusions and make it obvious that societal change is real.


I'm not sure what this comment has to do with the article. This is about targeted ads telling people they could vote by text message or that their polling place isn't safe. What does that have to do with "mass delusions" and "academic circles"?


The connection seems reasonable enough to me: the social media platforms take it upon themselves to arbitrate truth - and fine, that's their prerogative.

The falsehoods at issue for this very moment are about election day logistics, but the longer term narrative of "fake news fools mindless voters into making bad decisions" has also repeatedly been on these same radars in the same efforts of these same companies.


It is a bit meta, because I didn't have anything to comment on Facebook's move; it seems rather obvious (although it would probably be easier for them as a company to ban political advertising altogether this time).


Instead of mass voter manipulation of voters through the establishment media? Nobody would be in a panic at all if Hillary had won.


lolololololololololololol - people have been panicking for 26 years at the thought of Hillary Clinton being president.


America had to pick between an idiot, a criminal, and a socialist. They literally had no one to vote for. The country would be upset regardless of who won.


Just like nobody was in a panic when Obama won?

Have we forgotten what 2008-2017 was like, or do we just pretend that the republican fringe collectively losing its mind over a black man in the white house didn't happen?


Looking from the other side of the Atlantic, the difference is stark. Obama's were the first campaigns widely lauded for their aggressive use of social media. Nobody made a fuss about it, facebook didn't change a single rule because of somebody complaining or anything. The lunatics waving birth certificates were seen by us as some typical American oddity. This time it is different. Our facebook accounts have to be verified, our pages and apps vetted, comments are blocked or filtered, the words "fake news", "russian trolls", russian propaganda etc have become national issues even in countries which never face such a thing. There have been countless elections and referenda throughout the world during the social media age, but the "election phobia" and the idea that all elections are inherently tainted because people can't keep themselves from voting the wrong person/thing was born after the US election.


The issue with Facebook is that it's too easy for someone from Russia, for example, to place ads. I believe the established media doesn't run thousands of ads paid for by thousands of unknown individuals.


I think it was just recently that China placed ads in the Des Moines Register, with reports of similar activity on the Times etc.

https://www.desmoinesregister.com/story/money/agriculture/20...


At least in that case the Des Moines Register knew who paid for the ad. On FB there is no identity check (unless you consider credit/debit cards identity proof). Not to mention that you compare one ad with thousands on Facebook. Facebook can't stop them even if it wants to. The main reason is that the ads are too cheap to investigate properly. A bad actor can use multiple accounts, and when I say multiple I mean as many as it wants. The Times or the Des Moines Register would not publish 100,000 ads of $1 each, each about Trump's policy (more or less), but Facebook would.


The academics are wrong, the problem is tribalism. Facts don't matter, thought doesn't matter, even ideology doesn't matter. It's just about tribes at this point for a decent chunk of the population.

Now, social media is an excellent platform for whipping up further tribalism. But it's not so straightforward as "a few cambridge analytica ad buys -> electoral zombies".


Wait a sec! There is nothing wrong with tribalism per se. The problem is that fake news are convincing people that my tribe is wrong, which is obviously nonsense!

/s


> This will be an easy way to finally dissolve these mass delusions and make it obvious that societal change is real.

How so? If the democrats win, the republicans will be able to claim that Silicon Valley is manipulating the voters by deplatforming conservatives. If the republicans win, the democrats will be able to claim more Russian bot/fake news manipulation.


My assumption is that you can fool people one time. The Republicans may get a chance to blame 'deplatforming'. But Democrats again blaming the Russians when Facebook is vigilant and the whole world is watching closely ... hard to be convincing.


>This belief is based on flimsy evidence and on a mountain of bias...

Isn't that pretty much all political beliefs? Facebook is not so different from any other individual or organization in this regard.


I mean the academics, and I don't mean their political beliefs, but their conviction that the "fake news/russia/misinformation" complex is solely responsible for the Trump election.


It seems that Trump being in power has been a negative force for Facebook. I wonder if there is a legitimate business incentive for the company to manipulate the information it distributes to provide an anti-conservative bias.


are you implying that a social media company known to manipulate the mental state of its users might want to influence the people who vote for the politicians that talk about Facebook like a monopoly and want to break it up? that's unbelievable


[deleted]


Neither of these sources is peer-reviewed.


They spent over $100,000 on ads during the 2016 campaign. You can't deny that the voters were manipulated!

/s


Plus hundreds of people working full-time with a budget of at least $1.25M/month (which goes further in a country where the average income is around a sixth of the US).

https://assets.documentcloud.org/documents/4380504/The-Speci...

The big boost, of course, was the email hacking and successful use of their media outlets like RT & Wikileaks to successfully spin nothing into the impression of a scandal, which people are still credulously recirculating.

You can still argue that this was not significant compared to campaign spending or the larger PAC/media battle, but you don’t need to be donating your PR services to a foreign intelligence operation.


The scale of people affected went beyond ads - there was a lot of group creation and stirring controversy. They also got Americans to organize protests, I think the full reach was around 126 million people.

"When Facebook first acknowledged last year the Russian intrusion on its platform, it seemed modest in scale. The $100,000 spent on ads was a trivial sum compared with the tens of millions spent on Facebook by both the Trump and Clinton campaigns.

But it quickly became clear that the Russians had used a different model for their influence campaign: posting inflammatory messages and relying on free, viral spread. Even by the vertiginous standards of social media, the reach of their effort was impressive: 2,700 fake Facebook accounts, 80,000 posts, many of them elaborate images with catchy slogans, and an eventual audience of 126 million Americans on Facebook alone. That was not far short of the 137 million people who would vote in the 2016 presidential election."

https://www.nytimes.com/interactive/2018/09/20/us/politics/r...


  the full reach was around 126 million
Wow, controlling 1,260 people per dollar spent is quite the ROI.


The ban on misinformation covers only information regarding how you can vote, which seems reasonable since it is something that can be fairly easily fact-checked: either you can vote by mail or you can't; there isn't an in-between. However, I think it would be better if they didn't try to limit the spread of false articles - instead maybe they could put some kind of disclaimer on the post, saying "Facebook fact-checkers believe this to be false" or something like that. Then the actual distribution of the content isn't up to the fact-checkers.

Obviously Facebook can do what they want, they are a private company, but I think it is best if they treat everyone's opinion/articles equally as much as possible, so long as they are not violating the content guidelines on the website.


"Ban misinformation"? This is a very slippery slope... Facebook's very own Tower of Babel.


It's great that FB can even try and I'm sure they will improve things somehow.

My concern is apps like WhatsApp, which can't peek into conversations.

We're going through elections in Brazil and the amount of fake news I receive daily from family and friends is shocking. Some days I try to be a fact-checker, to send a message that they should know better, but it only creates anger (and I never take a position, simply show the facts; I'm sure they catalog me as a member of whatever other side they hate).

People only want to read/see things that agree with their views to the point that they are happy to forward pure crap to others.


I'm not particularly happy that social media companies are deciding on tripling down on shady behaviour. They're always heavy-handed and go around banning, silencing and censoring even unrelated stuff. It's gotten to a point where people in my circle of friends are moving on to alternatives because they had enough of dealing with their private communications being scrutinised, dissected and used as a reason to censor them by very fallible bots. I understand the challenges and issues involved in this whole kerfuffle --- but in the end if your users are unhappy and leaving your service, you're failing on delivering on your promises.

I wish I was smart enough to think up an instant solution to this, but I'm not. All I know is that there has to be a better way out of this than just a black and white solution. Society should be able to develop a way to integrate social media into itself and use it non-destructively.


Soon you'll see this linked from a Facebook post to prove that you can trust information posted on Facebook.

After all, if it was false they'd remove it!


Roads have speed limits. Social media can have link limits.

Sure, you might be able to exceed your limit by posting something that is patently false, or false only as a matter of opinion. But how would you enforce it if a person can simply say "oops, sorry, I thought you knew it was a joke" or "I agree with its opinion"?

Analogy: if a tree falls in the woods, does it make a sound?

If you only violate a speed limit when you're caught and the police write a ticket, how would you get caught on social media?


To prevent virality on WeChat, they impose a 100-share limit, after which new shares are invite-only and restricted to people who have a linked mobile number. What this effectively means is that if WeChat detects a 'viral event', anonymity can be squashed, and those responsible for causing it can be identified and, I suppose, questioned at a later date on whether they are anti-government or simply rabble-rousers.

Is this what you mean by wanting to have social media limits? That the government can step in and say, "Ma'am, you've shared enough, what's going on here, you're going to need a permit for this discussion" ?


Well, it's not a discussion after hundreds of strangers have been pulled in. It's more of a mob. It's disingenuous to talk about free speech on the web while deliberately ignoring that virality is a new thing, and it's not working out well to ignore what it can do.


> Is this what you mean by wanting to have social media limits?

I wasn't advocating for social media limits. Rather, I was trying to present one method of limiting and contrast with difficulties of enforcing it.

How, then, can you "ban" misinformation if the person spreading a link doesn't realize it was incorrect?

How can you prevent that same mechanism from censoring satire? How can you prevent that same mechanism from censoring honest opinions rather than facts? How can you prevent that same mechanism from censoring discussions?


Ah, I apologize, I attributed semi-advocacy where there was more like devil's advocacy, and it's too late for me to edit my post.

Maybe you don't ban anything at all? Maybe above an item that meets viral criteria (or as a click-thru to the post to increase content friction), a viral-disclaimer is put front and center: "This information has not yet been verified but our algorithms have detected that many organizations and individuals have been sharing this".

Don't ban, just discourage via technical means? Like how many companies use the dark-pattern of making it unbearable, but not impossible, to go through the process of unsubscribing or cancelling an account. Whenever anomalous unsourced information appears to be spreading, just increase the consumption friction. Make viral-response a controls problem. If that sounds like a typical engineer response, yeah, sorry, not sorry.

You want the viral response to be critically damped, some arbitrary compromise between stifling information flow and having a situation like in India where some remote villagers killed traveling strangers for being in the wrong place at the wrong time during a viral WhatsApp misinformation situation about child-kidnappings. https://www.bbc.com/news/world-asia-india-44856910
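To make the controls framing concrete, here's a toy version of what I mean. Everything in it, the setpoint, the gain, the cap, is an arbitrary number I made up; the point is only the shape of the response: friction rises smoothly with share velocity instead of a hard ban:

```python
# Toy proportional controller for "consumption friction":
# the further share velocity exceeds a setpoint, the more delay
# (in seconds) is injected before the content renders.
# All constants here are invented for illustration.

def friction_seconds(shares_per_min: float,
                     velocity_setpoint: float = 50.0,
                     gain: float = 0.1,
                     max_friction: float = 10.0) -> float:
    """Friction grows linearly with the excess over the setpoint,
    floored at zero and capped at max_friction."""
    error = shares_per_min - velocity_setpoint
    return min(max(gain * error, 0.0), max_friction)

assert friction_seconds(10.0) == 0.0      # below setpoint: no friction
assert friction_seconds(100.0) == 5.0     # above it: proportional delay
assert friction_seconds(1000.0) == 10.0   # runaway virality: capped
```

A real version would want damping on the rate of change too (hence "critically damped"), but even this crude proportional term is a throttle rather than a ban.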


It's just a painfully dumb metaphor, don't overthink it.


Limiting to 100 is still a large exponential. In just 4 'levels' of share, it can reach 100 million people.


I think 100 refers to total 100 shares of an original story, not 100 levels of share.

(Example: You can hit 100 shares in 2 levels of share. You have 100 friends share your story, each share reaches their 100 friends. That's only 10,000 people)
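The two readings give wildly different numbers. A quick sketch of the arithmetic, assuming everyone has exactly 100 friends (an invented round figure):

```python
# Arithmetic behind the two readings of "100 shares",
# assuming every account has 100 friends (an invented figure).
FRIENDS = 100

def reach_branching(levels: int) -> int:
    """Reading 1: 100 shares *per level* (branching factor 100).
    Reach after n levels is 100^n -- genuinely exponential."""
    return FRIENDS ** levels

def reach_capped(total_shares: int) -> int:
    """Reading 2: 100 shares *total* for the original story.
    Reach is bounded by shares x friends, no matter how deep."""
    return total_shares * FRIENDS

assert reach_branching(4) == 100_000_000  # 100 million at 4 levels
assert reach_capped(100) == 10_000        # the 10,000 figure above
```

So a total-share cap turns exponential growth into a fixed ceiling, which is presumably the whole point of the mechanism.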


They really should just disallow all posting from Nov 1 until after the elections. It would be much more effective.


Here's the official Facebook post about it: https://newsroom.fb.com/news/2018/10/voter-suppression-polic...


Facebook needs to be regulated, or the US will become the next China. Once the other tech companies see that they can censor political views they disagree with, they will, and the US will have what China has without the state even mandating it.


Facebook and Twitter are aggressively censoring alternative media and news sites that cover police brutality, political opinion and corruption. [1]

It's very strange to see these arbitrary and heavy-handed actions not receiving the same outrage as possible censorship by Google in China, when they are in effect the same thing.

Clamping down on political opinion and activism, i.e. dissent, is censorship. The Chinese do it directly and without apology; we use sophistry, achieving the same ends through nebulous ministry-of-truth terms like 'fake news', 'conspiracy' and 'FUD'.

[1] https://www.nakedcapitalism.com/2018/10/tide-turning-regulat...


Just in time for Zuckerberg's presidential campaign!


This is such a slippery slope. Even now snopes hasn’t debunked the “Trump tower server communicating to Russia”[1] story even though it was simply spam and is soundly debunked.

And when we learned that our national intelligence agencies wiretapped a duly elected president? CNN called it fake news ("we're dumber for having heard that").

If we let these “fact checkers” control the narrative we’re sacrificing our own ability to critically judge things for ourselves. Because we’ll never see it.

I have zero doubt that “misinformation” is simply — or already — overloaded with political biases.

1. https://www.snopes.com/fact-check/trump-server-tied-to-russi...


> If we let these “fact checkers” control the narrative we’re sacrificing our own ability to critically judge things for ourselves.

Give me a break. We sacrificed our ability to critically judge things for ourselves the second we decided we're ok with algorithmic feeds and timelines. Adding fact checkers only changes some input variables - the net result is still that each social media user lives in a thought prison of their own design.


> Months ago, senior Facebook executives briefly debated banning all political ads, which produce less than 5 percent of the company’s revenue, sources said. The company rejected that because product managers were loath to leave advertising dollars on the table...

Gotta remember what's most important. /s


Do you have a problem with political ads on tv? From my perspective we should have campaign finance laws that regulate political ad spending, but there's nothing inherently wrong with showing political ads on tv, facebook, or elsewhere.


> ... there's nothing inherently wrong with showing political ads on tv, facebook, or elsewhere.

As has been pointed out elsewhere, Facebook is different from television because ads can be targeted extremely narrowly. This makes them hard to audit, and makes it possible to use ads for things like suppressing the vote of populations likely to vote a certain way.

Advocating for or against a specific policy or politician is one thing. Attempting to disenfranchise a group of people through misinformation is another.


Not the person you asked, but yes, I do have a problem with that. Luckily, it's not allowed in most of Europe.


5% is a lot. Political ads are 5% of Facebook's revenue? Wow that's way more than I expected.


They have a fiduciary duty to their investors to not leave money on the table... blah blah blah.

There is a lot of cheerleading of bloodthirsty capitalism in the hallowed halls of CNBC and the like.


They are a business, after all.


> They are a business, after all.

Just like a cigarette company that doesn't want to leave money on the table from not courting youth sales.

It's a twisted idea that the logic of business success can be used to justify all kinds of destructive and harmful decisions by businesses.


Agreed, but the idea that any publicly traded company can or will have ethics is an equally absurd idea. I tend to think of companies as Darwinian machines whose evolutionary 'fitness function' is the maximization of profits.


There are both business and public policy arguments for not banning political ads, as the article notes.


Well, by not "leaving money on the table" they now have an existential crisis.


This is something that seriously needs to be clamped down on. It is outright fraud, plain and simple. And since it's being done with the intention of disenfranchising people, it is extremely evil.


Fraud requires financial gain. This is just lying. People are allowed to lie in this way, and you should not believe everything you read, particularly if it sounds very appealing, gets your emotions fired up, or seems to offer benefit without requiring any payment or responsibility.


A lot of these fake news sites don’t care about the truth. They’re all about generating clicks to their sites which produce revenue.


I don't see how that's relevant.


"People are allowed to lie in this way"

No. I flat out reject the idea that people should be able to lie about the facts of when and where voting occurs for the purposes of disenfranchisement. That should be severely punished. That is not freedom of speech, that is again outright fraud.


Do we know who is engaging in this behavior? I'd like to know who they are so we can make sure they don't try to do something similar again.


I don't have links but the Russians were definitely targeting communities in efforts to reduce turnout during the 2016 election. Reading about the scale and intricacy of their influence campaign is pretty fascinating and worth a read.


[flagged]


You're absolutely right... "dnjdirjrmrf"...

Objective reality doesn't exist, and there is no such thing as either misinformation or disinformation, and we should be frightened of anyone who tries to distinguish between them. Truth is lies and lies are truth.


They'll most likely filter out anything conservative.


Why do you think that might be?


Are you insinuating conservatives are disproportionately creating fake news?

Silicon Valley's left wing bias is well established. There's no way companies like Facebook can censor impartially.


Are conservatives disproportionately posting false information about voting requirements, voting methods, long lines at polling places, and violence at polling places?

That's what the article is about filtering, so if conservatives aren't doing that more than liberals, their posts won't be hit more.


I really think they have better things to worry about. Maybe fixing their news feed so that I don't see the same four pieces of content day after day after day. With a few hundred contacts on Facebook that aren't unfollowed, I should be seeing something, but it's not surfacing at all. Even if I turn on notifications for particular people that I really care about seeing content from, about half the time I don't see things. Really makes it pointless to bother with Facebook very much.



