The problem we face is unprecedented soapboxing. Insane ideas have been around as long as humanity. Usually you stand on a soapbox and preach, and people ignore you. Some might even dissuade others from listening.
The problem with online opinions is that while the "opinions are like assholes, everyone's got one" adage is true, online nobody is there to tell you "stop watching this stupid thing". And we all know YT comments are a heap of garbage. Also, YT/Instagram comments are fully moderated by the author, so removing all disagreement is easy, which makes it easy to give the appearance of consensus.
And worse, once you watch one, YT's algorithms want to keep you watching, so suddenly you are flooded with related videos.
I've found it almost impossible to remove a specific feed or ideology from my YT recommendations. As an example, I once watched Yanis Varoufakis speak at the Oxford Union. He has appeared on panels with Slavoj Zizek, a person I personally can't stomach. Almost immediately after the Yanis video, YT recommended a 2016 panel discussion between Zizek and Varoufakis. I didn't watch it - but since then I get Zizek constantly recommended in my feed.
Note that I don't use Google for searches, and my YT runs pretty much isolated in Chrome while I use Firefox/Tor for all other browsing. So Google doesn't know me, and YT knows me only via my handle, which is otherwise not connected to anything (outside YT).
I have countless examples like this. YT keeps pushing me down a specific ideological path without any action on my part. Sometimes it's great, but in those cases where it's not, it's not just bad but outright dangerous. I don't think YT has that effect on me, since I'm well past the age where my behavior can easily be shaped. But I can see how this sends somebody over the cliff who is still figuring out who they are.
People like rockstar2016 (aka the MAGA killer) or the woman who went into YouTube HQ and started shooting the place up are probably good examples of how social media can radicalize those who are the most vulnerable.
There's been an uptick in conspiracy theories lately, I kinda blame the whole "mainstream media lies" movement. Some people are starting to believe that if almost everyone believes something, it has to be a conspiracy.
My favourite one right now is "5G is bad and will kill us all and microwave birds". There is not a single article on 5G without the Facebook comments being full of people shouting about the extreme dangers (even when it's just about enabling the fake 5G on frequencies previously used for LTE). No matter what you say, they will keep linking you to vague YouTube videos by people who claim to be doctors, and to misinterpreted documents from the 80s. I don't get it at all.
It's very attractive in an unpredictable world if you're feeling vulnerable, scared, disenfranchised. Someone out there has the answer, and oh man it's great, because someone's answer is exactly the thing I'm inclined to believe, and it reinforces/validates my values.
The whole 5G radiation conspiracy thing has been going for a long time; I think it's as old as mobile technology itself. Surprising they haven't blamed RF for the disappearance of insects yet. FWIW, it's probably healthy to be skeptical about blanket statements claiming RF has zero risks (very high exposures have been linked to brain tumors in rats).
Off-topic: David Dees has some funny (to me hilarious) artwork on his site that describes these conspiracy theories better than words: http://ddees.com/new-art/
The ones about "blockchain" and 5G are so friggin good I've set them as desktop wallpapers :)
Your premise is reasonable, but the "mainstream media lies" movement is rooted firmly in reality, so root cause analysis points elsewhere.
From the Russian collusion narrative, to the Avenatti client claims, to the Covington school coverage, to the recent Jussie Smollett coverage, Western media increasingly does not bother to check facts before running with stories.
It's fair to blame the "msm lies movement", but that movement itself has a root cause.
Our society is premised on the idea of free exchange of ideas. Some will freely choose to think/do/believe things we find insane (some cults/religions/jobs).
The premise of this article is that these videos are going to brainwash/hypnotize poor fools into believing crazy things.
Newspapers need to win not by being the only voice in the room, but by being the most relevant, clear, researched, reliable, unbiased voice in the room.
The problem is that a very large percentage of people apparently don't really want "relevant, clear, researched, reliable, unbiased", they want to be told that they're the good guys, that all their problems are someone else's fault, and that they don't have to make any changes in their lives.
And social media giving these people what they want creates a vicious circle that does real damage to society as a whole.
A platitude like "free exchange of ideas" does not address this problem.
Why do you think that people will automatically listen to the most relevant, clear, researched, reliable, unbiased voice? Lots of people end up listening to loud, angry, biased voices that say a lot of bullshit. We see this with things like the anti-Vax movement.
1. The law isn't involved in this; this is YouTube changing things.
2. The law gets involved with factual stuff all the time. You can't sell pills and make certain claims about their health effects, you can't print a newspaper and slander people, etc.
So both things are true: our society is based on the idea of free exchange of ideas, AND people are susceptible to disinformation.
You seem to be under the impression that the truth will win out based simply on the virtue of being true, but that is simply not how humans work. Disinformation spreads because it can be crafted specifically to take advantage of the flaws in how humans think and decide truth. We can't just bury our head in the sand and think this is not a problem.
I am not, in any way, arguing for censorship of false information. However, I do think we need to be aware of the threat disinformation poses, and work on strategies to help inoculate our population against it. It is a false dichotomy to think our only two choices are 'ignore disinformation' or 'censor it'. We can do other things.
It is dangerous to ignore it, or to act like it isn't a threat. While I do believe that the overall arc of history is towards more understanding and truth, it is foolhardy to assume this path just happens naturally. It happens because people pay attention to threats against the truth and respond in ways that counteract them.
Yes, but the only way to keep that freedom is to use it responsibly. In democracy, you're entrusting the people with control of the government. This is why we have public education, to teach people our principles and responsible ways to exercise their freedom. Some people will find their own way and go outside the lines of respectability, sometimes for good, sometimes for ill. Society depends on most people not doing this though. Take guns for example. We have very permissive gun laws, which is nice as long as people don't use them to, say, shoot up schools.
It used to be that while we had free speech, publishing companies had standards of respectability. Newspapers weren't going to print smut even if it would sell more papers for example.
The problem with YouTube is that it's basically a newspaper editor with no ethics: it only cares about selling papers/maximizing on-screen time. My son has recently been into watching videos of the moon landings/the lunar rover. If I leave YouTube running, the recommender will automatically start playing moon hoax conspiracy videos. What's worse is that it uses this same asinine recommender algorithm, which tries to get me to watch as much YouTube as possible, even though I pay them $18 a month. Given that they already have my money and aren't showing me ads, I would appreciate it if the recommender had my best interests at heart rather than just showing me videos that an algorithm thinks will addict me.
That might be how newspapers win on quality, but it is not how they will win eyeballs. Clickbait works. Low-quality content sells volume. Biased sources are extremely popular because of, and not in spite of, their bias. The fundamental problem here isn't the lack of quality content; it's that we're deeply flawed and easily hackable creatures.
> Newspapers need to win not by being the only voice in the room, but by being the most relevant, clear, researched, reliable, unbiased voice in the room.
But being relevant, clear, researched, reliable and unbiased doesn't get as much attention as feeding people's biases and fears and all the rest of that. Is it possible to win if all you have to offer is relevance, clarity, research, reliability and lack of bias?
But is this really the case? For decades there were things like the National Enquirer and other such rags sitting by the checkout counter with all sorts of sensationalized, hyperbolic, and often fake articles. Yet people, somehow, understood that taking anything from them as more than entertainment was not a great idea. I think a big part of the reason for this is that there were other, much more stable, sources of information that consistently provided reliable information in a clear, objective, and well-considered manner.
Today it's increasingly difficult to point to any particularly stable source of information. The media has become ubiquitously driven by partisanship, agenda, and sensationalism. And reporting of falsifiable facts has been replaced with opinion, conjecture, speculation, and leading statements. One of the biggest issues is that this is all happening during the era of the internet. In times past, if you wanted to read older news articles you needed to go to a library, hope they had the paper scanned onto microfilm, and then dig through it to try to find whatever you were looking for. Now when a paper engages in shoddy reporting, it can be instantly referenced and used as an example to all.
The lowest common denominator always wins the short game in society. Offer somebody a sensational[ized] and poorly researched story, or a well researched, considered, and frankly somewhat boring story. The sensationalized story will almost always win in a one-off. It's fun to read light weight trash when you're just going for a taste. Who didn't occasionally flick through that National Enquirer at the checkout stand? It was fun! But in the long game, the trash loses. It may be fun to read a trashy story here and there, but it's not fun when that's all you have because it's ultimately meaningless.
The big problem is that I think the media was very late to react to the internet, and then overreacted by trying to compete with social media. And that inevitably meant becoming trash. I'm going to conclude with a very high-hanging target, The Intercept. In my opinion The Intercept remains arguably the most reputable media outlet today. Nonetheless, especially in recent times, they have also started to go down the path of partisanship, and it always ends the same way.
---
[1] - This is an archive of an article The Intercept ran some time back suggesting it was absurd and unimaginable for the president to suggest that some bomb threats against Jewish community centers might be driven by something other than anti-semitism. The outrage-driven article focused on his suggestion that attacks can, in lieu of anti-semitism, sometimes be done because "Someone's doing it to make others look bad." The paper suggested such thoughts were "white supremacist conspiracy theories."
[2] - This is an article The Intercept ran two days later, after it was revealed that one of their former reporters had been the person executing the bomb threats. He was an extremely liberal black individual who reported almost exclusively on social justice and racial issues. He had been calling in the hoaxes in a convoluted attempt to frame his ex-girlfriend (who had recently broken up with him) for the bomb threats.
[3] - This is how the article was updated given #2.
Some conspiracy theories have already been proven true by Wikileaks and Edward Snowden. Just because something is labeled a conspiracy theory doesn't by itself make it true or false.
What we know already is that powerful organizations have secrets that they spend big money on protecting. The difference now is that with the internet, the time for which a secret can be kept has shortened significantly.
I have to wonder how much of the uptick in conspiracy theories and extreme perspectives is due to nation-state actors attempting to sow discord. I don't think people need help being crazy, but social and new media are the equivalent of getting to drop millions of flyers into enemy territory for a small fee and little to no risk.
In the early days of the internet, I saw all this "information" being available (or soon to be available) as a positive. I had no idea what the negatives would be at that point, let alone their scale.
This is like all those politicians and journalists lamenting the rise in violence, when both war and violent crime are actually at historically low levels. We are today in a golden age of science, where people's beliefs are on average the most fact-based they have ever been. You wouldn't believe the ridiculous things people believed in the past.
Yes, we have a long way to go and we should continue to seek improvement, but don't try to tell me that things are getting worse when that's clearly not the case. Or at least provide some evidence that's not anecdotal.
Annoyingly, conspiracy theorists are great consumers. They are primed to reject the status quo and try new things. Look at Alex Jones' merchandise. I am not saying that his supplements are snake oil, but he definitely profits from curating an audience that is skeptical or outright rejects modern medicine.
My favorite YouTube conspiracy is the rainbow conspiracy; it appears the person is not aware of the color spectrum.
That said, filter bubbles and AI feeding us more of what we "want" are probably increasing conspiracy theories and making us less tolerant of different views. There has been little talk of the morality of AI algorithms.
Personally, I was recommended "hot" dating site advertisements when I was having relationship issues. Not what I needed, but surely what most likely pays the best. The issue is the lack of AI morality, or rather that profit comes first. Objectively the AI was right, but morally it was not: I am a parent.
Ads probably should be regulated; at the very least, you should be able to say no to AI algorithms and behavioral psychologists maximizing profit for private companies.
The most relevant and interesting aspect of any conspiracy theory is the fears it exposes, whether it's fear of environmental degradation (chemtrails), fear of loss of agency (anti-vaccination, anything mentioning the Federal Reserve, most political conspiracy theories), or racial identity fears (anti-semitism, "White Genocide", Islamophobia, etc.).
One underappreciated aspect of the rise of conspiracy theories in the last 20 years is the fact that more people are insecure financially and have lost trust in institutions, often justifiably so. So conspiracy theories meet less resistance than they would have from people who are not worried about their survival in the face of a society that seems determined to immiserate and impoverish them if not destroy them.
Additionally, we have a number of examples of institutions being subverted to malign ends. From Wells Fargo to Wikileaks, we consistently see dishonest behavior from organizations that claim to have the public's interest at heart...
I wonder if that is sometimes their intent. Sort of the "oh it's just a joke" situations you see, but the intent may be just normalization of it / a gateway to keep planting these ideas and doubts.
There was a quote to that effect (which I can't recall right now, sorry) from pre-WWII USA pro-nazi groups.
Not that anyone who uses the technique is necessarily pro-nazi, but it's a technique with an established history of [attempted] use, IIRC, in the popularization of presently unpopular ideas.
In my most cynical moments, I've wondered if the recent "growth" of extreme right-wing movements in the US, and the response, is a long game with regard to civil liberties.
The problem is that it's a direct result of their business model and environment. I would think they could fix it (or at least put a big dent in it) immediately if they ditched the recommendation engine, autoplay, and other tricks to keep people watching. But they can't do that, because of the choices they made 2-5 years ago.
Mostly because this hoopla makes it hard to monetize those videos. But even conspiracy theorists need toothpaste and skin cream. YouTube needs to create an education program for advertisers that people who consume these videos are people too. Maybe sell prepper supply ads.
Well, first off, YouTube is a private corporation, not subject to free-speech laws.
Second, while I agree that YouTube shouldn't ban people for believing in conspiratorial nonsense, even if I do think it's harmful, it's not unreasonable for their algorithm to favor things that are fact-based, or at least not contributing to a toxic platform. It's hard to argue that any good came from Alex Jones posting the addresses of the victims of the Sandy Hook shooting, and it doesn't create a fascist dystopia when YouTube doesn't want to promote something like that.
Also, I wouldn't be 100% sure that all the flat-earthers are trolling. There's an overwhelming number of stupid people in the world, and while I will say that most of them are probably messing around, at least some are serious. At some level, I think it's far too easy for people to say "I'm just trolling" after the fact.
I make no argument against YouTube being basically a free-speech zone, and I wouldn't suggest banning people for purely ideological reasons, because I agree that that would be against the American Ideal.
That said, if I were hosting my own video-sharing website, there's no way in hell I would recommend a white-supremacist video, or even a flat-earth video. If people find it on their own, that's on them, but for the same reason that I personally wouldn't pass along flyers for thedailystormer, I wouldn't promote someone like Alex Jones.
For the same reason that the NY Times is not required to platform white supremacists, and the same reason that I don't have to host Nazi Propaganda on my website, it is part of YouTube's free speech to platform whomever they want.
It is impossible to say what people do or don't believe, but I worked with a guy who would argue the earth was flat. His massive knowledge of the conspiracies and complete lack of knowledge about physics was a constant issue.
Oh, we worked on a project computing link budgets for geostationary communication satellites. The company made all of its money off of satellites, and he could completely disconnect from that.
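For anyone unfamiliar, a link budget is mostly just adding gains and subtracting losses in decibels to see whether the signal closes with enough margin. Here's a rough back-of-the-envelope sketch for a GEO downlink; the EIRP, G/T, and bandwidth figures are made-up placeholders (not from the project above), and a real budget would also include atmospheric, pointing, and implementation losses:

    # Toy GEO downlink budget -- illustrative numbers only
    import math

    def fspl_db(distance_km, freq_ghz):
        # Free-space path loss: 20*log10(d_km) + 20*log10(f_GHz) + 92.45
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

    GEO_DISTANCE_KM = 35786   # roughly the distance to a geostationary satellite
    FREQ_GHZ = 12.0           # typical Ku-band downlink frequency
    EIRP_DBW = 52.0           # satellite EIRP (assumed)
    G_OVER_T_DB_K = 20.0      # ground-station figure of merit G/T (assumed)
    BOLTZMANN_DBW = -228.6    # Boltzmann's constant in dBW/(K*Hz)
    BANDWIDTH_HZ = 36e6       # transponder bandwidth (assumed)

    path_loss = fspl_db(GEO_DISTANCE_KM, FREQ_GHZ)
    cn0 = EIRP_DBW - path_loss + G_OVER_T_DB_K - BOLTZMANN_DBW  # C/N0 in dB-Hz
    cn = cn0 - 10 * math.log10(BANDWIDTH_HZ)                    # C/N in dB

    print(f"Path loss {path_loss:.1f} dB, C/N0 {cn0:.1f} dB-Hz, C/N {cn:.1f} dB")

The punchline being: every number in there depends on the earth and the satellite's orbit being where the physics says they are.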
> it's interesting how newspapers are thumbing their noses at the first amendment and smearing competitors who also enjoy its protection as they perhaps sense an opportunity at regaining gatekeeper status.
Isn't that the same stuff conspiracy theorists always say when someone points out their theory is BS?
Yes, that's the exact same line of argument I see from people touting inane conspiracy theories.
Unfortunately, we see a lot of conspiracy theories have very real and damaging effects on people, often not those indulging in them. Just look at the idiot who attempted to burn down that pizza place because of Pizzagate, or the many people harassing families of school shooting victims because they believe these are all government-orchestrated tragedies.
The first amendment is about protecting speech from the government, and since the NY Times isn't a government entity, the right to free speech doesn't pertain.
Because people view it as harmful. If even one person thumbs their nose at science because some idiot on YouTube told them that gravity can't be real because "water always finds its level", that is causing harm. If even one person takes Alex Jones's ramblings about the Sandy Hook victims seriously, and perhaps starts sending texts threatening the victims (which actually happened), that is causing harm.
It's not hard to see why recommending false and/or misleading information is dangerous, and you'd have to be purposefully ignorant to not see that.
I don't think that the NY Times or the YouTube executives would ever advocate that you arrest people for spreading this misinformation, short of direct harassment, but they don't want to be party to the purposeful dumbening of society.
The thing you are not considering here is that in this world of censorship you now need a 'Ministry of Truth.' Let's consider Google and the NYTimes. The NYTimes was one of the loudest cheerleaders for the war in Iraq. They unquestioningly ran stories advocating and spreading likely fabricated "evidence", even though a critical examination would likely have revealed some serious questions; they chose either not to investigate or not to ask those questions. They, years after the fact, wrote an article condemning their own lack of scrutiny [1]. But did they really learn anything? They returned to the exact same pattern of unquestioning cheerleading for war in Syria, and I predict we'll likely see something similar should our actions in Venezuela culminate in yet another invasion.
Google is a massive multinational corporation that profits by harvesting and exploiting personal information about people. They have shown an eagerness to expand their operations to countries such as China. They planned to launch a censored search engine and were happy to agree to record and track individuals by tying their searches to their phone numbers. Such behavior would enable convenient tracking and 'correction' by Chinese officials if they so desired. Among the list of terms they included in their prototype for China was literally "human rights" [2]. A company literally censoring information on "human rights" is like something out of bad dystopian fiction. This should not be reality - but it is.
I'm sure you see the point I'm making. By claiming that the average person is too stupid to be allowed to access whatever information they would like, you are implicitly asking somebody to be the gatekeeper of truth who decides true vs false, right vs wrong. But nobody is capable, let alone deserving, of that role. And the most ironic thing of all is that the typical gatekeepers of truth that people would nominate are some of the worst actors in society, who have certainly caused unimaginably more harm than the total sum of the consequences of all the absurdity spread by individuals. That's because when individuals spread fake stuff around, they might fool a few people, but it mostly goes without notice. By contrast, when the current gatekeepers spread misinformation, hundreds of thousands of people die at a cost of literally trillions of dollars.
It's amazing how "the US government successfully misled everyone to invade Iraq" now justifies the world being flat.
This seems to be being mentioned more and more these days, too. It's as if it's the only thing people learned from the war, and they blame the media for it.
Sure, but this is under the assumption that the information is being censored, which I don't think anyone in this conversation is calling for. I'm certainly not suggesting that flat-earthers be jailed for believing in something really stupid, and I'd certainly never support them being shut down in a public venue unless they got violent.
That said, and as I've kept repeating, YouTube and Google are not government entities, and they aren't required or even given incentive to platform horrible people, or people that they view as horrible. While I agree that it's a bit disturbing that Google is releasing a censored search in China, I don't live in China, and I was talking largely in regards to the United States (since that was where the whole freedom of speech thing came up).
We draw the line all the time. If someone was in my house and started spouting off Neo-Nazi propaganda, I would tell them to leave, and I would be completely unimpressed with their argument for freedom of speech, as I think you would as well. Am I an evil totalitarian dictator because I don't want to give an audience to people I think are disgusting? Am I anti-free-speech because I'm denying the other members (especially children) of my house the ability to hear opposing viewpoints? Of course not; it's my property and I don't want disgusting people in there.
I would definitely prefer to keep open discourse, but my point is that I don't see how it comes down to the evil dystopian world that your comment indicates because YouTube doesn't want to recommend stuff that they view as dangerous.
YouTube has 1.8 billion monthly users - nearly one quarter of the entire human species. They have about 550% of the population of the USA inside their 'house.' They have hundreds of millions more users than the largest country in the world has citizens. I think the only thing analogies to household (or even most business) rules emphasize is how inappropriate they are for deciding what the most reasonable action is in this sort of scenario.
YouTube is a natural monopoly, which changes the whole picture. It even works as a bypass for literal first amendment infringement by the government. Imagine a government entity wanted to prohibit discussion of a given topic. In past times, its only option would be to try to pass legislation against it. That's where the first amendment kicks in. In modern digital times, however, there's another option: simply apply pressure or offer incentives to e.g. Google and Facebook to ensure the topic ends up on their blacklists. It's a clear violation of the spirit of the constitution without clearly violating the constitution. None of these issues came up when the constitution was written, as the concept of a private company having a monopoly on public discourse would have been completely nonsensical.
I think it's completely unavoidable that the next socioeconomic movement of society will be to an overt corporatocracy. That's disappointing, but it is what it is. The only thing I wish is that people would realize these steps are exactly how we get there. This all effectively comes down to not only simply accepting a monopoly of this scale, but now further suggesting that this monopoly begin ensuring that the discourse is 'corporate approved'. I'm certain YouTube will be thrilled to comply.
Sure, I have an issue with YouTube being a near-monopoly too, and if the discussion came down to "YouTube is deleting videos for ideological reasons", I think your point would stand.
In this case, however, the issue came down to their recommendation engine. Even if you did view YouTube as a government entity, which I do not, I don't think it says anywhere in the constitution that the government has to recommend every side of an argument, just that you're allowed to make it.
All that being said, I actually have been working on and off on a clone of YouTube using distributed hash tables, so maybe we'll be off of it soon enough :)
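In case anyone wonders what that looks like: the core DHT idea is that a video ID hashes to a key, and the peers whose IDs are closest to that key (by XOR distance, Kademlia-style) are responsible for storing it, so no single company decides where content lives. This is just a toy, hypothetical illustration, not my actual project, and the peer names are made up:

    import hashlib

    def node_id(name: str) -> int:
        # 160-bit identifier derived from a name (peer address or video ID)
        return int(hashlib.sha1(name.encode()).hexdigest(), 16)

    def closest_nodes(key: str, peers: list[str], k: int = 3) -> list[str]:
        # The k peers whose IDs are nearest to the key under XOR distance
        key_id = node_id(key)
        return sorted(peers, key=lambda p: node_id(p) ^ key_id)[:k]

    peers = [f"peer-{i}.example" for i in range(20)]   # hypothetical peer addresses
    print(closest_nodes("video:dQw4w9WgXcQ", peers))   # peers responsible for this video ID

The interesting part is that recommendation (if any) then has to be a separate, client-side choice rather than something baked into the storage layer.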
I definitely did pivot, but you did as well. You swapped from "Google should censor" to "Google can censor." These are quite different issues, though they frequently end up intermingled, as in our chat.
Efforts at competition should not be neglected, but I think it will likely prove futile. There are already plenty of alternatives to YouTube, but that doesn't matter with a natural monopoly. Content producers want to go where viewers are. Viewers want to go where content producers are. Whoever becomes the 'big one' first, wins.
There's a fundamental problem, which is that content users intend to be free is something private companies then claim effective ownership of as a condition of being able to say anything at all. This is an interesting 'trick'. I call it a trick because, let's say the average person posts something to e.g. Facebook or YouTube. Would they mind if another site, with attribution, also shared their content? In the vast majority of cases, the answer would be no. Most people are just posting things for enjoyment or to express themselves; they'd love it if it got shared as much as possible. But other sites cannot share these users' content, because e.g. YouTube or Facebook claim and defend exclusive ownership of what is posted on their site. You'd need to get each user's express permission to share their content, and that's not really viable.
Imagine for a second that we killed this trick. Companies that provide user generated content for free, or with a free account, could only publish content under non-free licenses if the content creator specifically opted in to that agreement. However, even if they chose to not opt-in the company would still be obligated to publish and treat their content identically to how they would have if the user had opted in. This would all go away if the company charged even $0.01 for access. The company could also incentivize users to opt-in, such as by paying them up front for their content.
The idea is to turn "free" into simply free. This would enable real competition since free access means somebody could simply start copying content created by users who wanted to post free content on e.g. YouTube or Facebook and share it in a different venue. The exact same would be true of comments and other such user generated content that was always intended to be free, and not "free", to begin with.
But so long as a monopoly is able to claim effective ownership of material users meant to be free, this system will likely only grow larger.
> ...as no one really believes the earth is flat, it's just attention-seeking behavior and trolling.
You're underestimating the stupidity of a vast portion of the population. There are plenty of people who believe not only that the earth is flat but many other conspiracy theories as well. Dismissing it as trolling is naive.
Free speech isn't something that was created by the US Constitution, it's a right that everyone has by default. It doesn't matter if it's a private company or a government, restricting what people are allowed to say is still restricting free speech.
What? No, that's not how this works. You have freedom of speech. I have freedom of association. This is why I don't need to host your ramblings. I can choose to tell you to get off my lawn (metaphorically) and go ramble somewhere else.
Is this really YouTube's fault? The President of the USA thinks China invented climate change as a hoax to reduce the economic output of the USA. Who can expect youth or anyone really to be okay in an age where the ruling parties are themselves the ones creating and pushing the wildest conspiracies?
And YouTube is a breeding and feeding ground for those conspiracies. Ever more extreme content creates emotional reactions, generates clicks, and binds attention; therefore it gets recommended frequently, to optimise the only metric that matters to YouTube: the time you spend on their platform - no matter what.
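To make the mechanism concrete, here's a deliberately oversimplified sketch (not YouTube's actual system; titles and numbers are invented): if the only ranking signal is predicted watch time, whatever holds attention longest wins, regardless of whether it's true.

    from dataclasses import dataclass

    @dataclass
    class Video:
        title: str
        predicted_watch_minutes: float  # hypothetical model output

    def recommend(candidates, n=3):
        # Rank purely by expected watch time -- no notion of accuracy or harm
        return sorted(candidates, key=lambda v: v.predicted_watch_minutes, reverse=True)[:n]

    candidates = [
        Video("Apollo 11 documentary", 8.0),
        Video("THE MOON LANDING WAS FAKED (proof!)", 14.5),  # outrage holds attention longer
        Video("Lunar rover engineering overview", 6.5),
    ]
    for v in recommend(candidates):
        print(v.title)

Any real system is far more complicated, but as long as engagement is the objective, the incentive points the same way.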
All the TV news networks then as well, right? Everyone who plays video of Republicans spouting lies about the world, without directly commenting on how wrong it is, would be equally guilty? More guilty?
This stuff starts with Authority, on live TV - not some rando YouTube channels.
The responsible news networks really do put some effort into verifying claims and confronting politicians, sometimes the moment they claim it, and they do filter who gets on...
I don't think the article's assumption that YouTube has created a conspiracy theory boom is even fact-based. I remember the dial-up age, and people typically knew some corner of the web for reading conspiracy theories to get some laughs. When 9/11 happened there were those conspiracy documentaries, and this was before YouTube was what it is today.
My guess, if a boom is being created, is that the reason is people getting craftier at getting others absorbed in conspiracy theories. Or maybe the health and happiness of society has decreased to the point where people are now easily absorbed into conspiracy theories. Misery is a beast that pulls people into time-sink holes to escape reality.