I think it would be better if the internet went not in the direction of removing stuff, but in the direction of providing users with tools to avoid the content they don't wish to engage with.
We shouldn't rely on security through obscurity when it comes to dangerously stupid ideas.
If you are in the supermarket buying food and people are giving out free trials of crack cocaine -> addiction city.
You can very regularly see Nazi apologia that is dressed up to sound benign. "Teach the facts", "did this really happen?" - debunking it requires access to people who are practiced at debunking it.
The same rule applies to a variety of topics.
Simply put, the average person is not prepared for lawyer level rhetorical devices in the wild.
So either we remove content like that, or we live through whatever disasters a world of reality distortion creates. Most likely the survivors of that world will simply lock the news down hard.
>Simply put, the average person is not prepared for lawyer level rhetorical devices in the wild.
This boils down to "censorship is okay because people might believe the wrong thing", and down that path lies a well-greased and well-trodden slippery slope ending in the maws of dragons.
It's incredibly paternalistic. It is not your job, nor the government's, nor anyone else's to protect people from reading/thinking/etc. the wrong things.
It actually was my job, and I saw exactly what leaving things alone did. I know how many ways your ideas are incorrect, because I worked from that same starting point.
If you believe that ideals are going to do the job, please feel free! There are hundreds of subs that need mods. Take the time to find the most gnarly politically active sub and apply your ideas. Find the places where people actively deal with propaganda and indoctrination.
If you can solve what everyone else has failed at, every firm, regulator and moderator under the sun will cheer you on.
Till then, this is what works. Not because I like it, or because it's "paternalistic" - but because it conforms to how humans actually behave and how content performs online today.
It is what it is, and we can only improve by knowing reality first.
Those attitudes and behaviors are merely engineered traits to make Reddit engaging and active; it's sad to see someone who seems to be an otherwise happy person being wasted like that.
I'll keep my idealism, thanks. I've seen what censorship does, and I find it to be a far worse monster than any of those spawned by the free exchange of ideas.
> Simply put, the average person is not prepared for lawyer level rhetorical devices in the wild.
Maybe we should figure out how to educate people to prepare them?
Can we really build our safety on our ability to keep stupid people ignorant of stupid ideas?
I think that's the kind of security through obscurity that is currently failing us so hard. Stupid people are finding stupid ideas anyway, sharing them, building upon them. Trying to keep them away from stupid ideas is like trying to keep your body perfectly germ-free. You can't really do that. Not in the long term. You are better off figuring out how to boost your immunity.
People aren't code. Plus you don't see people exposing themselves to prions or the plague.
More pertinently, you could argue that we shouldn't worry about sociopaths, and that if they convince or charm someone - that's just bad luck, and it boosts people's immunity.
Why did we worry about lootboxes or attention manipulating algorithms?
Because when it's not a fair fight, the outcome is not fair.
If I sent you up against professional Nazi-denial content, and you knew jack about the Nazis - I guarantee you would think they had a point. There are supremely persuasive books saying the numbers are wrong, or that the data doesn't add up.
There are some ideas that are simply that toxic - handling them without training is a surefire way to radicalize someone.
=------=
Sure, I would love to have classes like they are doing in some countries.
First - I doubt that flies in America, where Fox News is one of the largest networks.
Second - even if you did start classes for kids and teens - it still leaves your adults vulnerable.
I don't even know if it is possible to de-radicalize the already radicalized. If you don't stop the radicalization, you will simply have more of such people.
So sure, get those classes going, but it's only one part of the solution.
>People aren't code. Plus you don't see people exposing themselves to prions or the plague.
Aren't vaccines kind of this: deliberately exposing people to something harmful so they'll have more immunity against a stronger version if they encounter it later?
And yet we program them to read, write, do calculations, know some narratives about the history of their civilisation and some narratives completely made up by fellow humans. They even know some things we discovered about our physical reality and some things we constructed.
> Plus you don't see people exposing themselves to prions or the plague.
Because they are way more lethal on consumption than any idea is. We still eat cows, though, and talk to each other. We just made it a bit harder to expose ourselves to plague and prions on a daily basis.
> More pertinently, you could argue that we shouldn't worry about sociopaths, and that if they convince or charm someone - that's just bad luck, and it boosts people's immunity.
Oh, you should worry. And figure out how they are created, and let everybody know what made them this way and exactly how stupid it was. And how to recognize that you are getting charmed, what the limits of your knowledge are and how to expand them. And this exposure will boost immunity.
> Why did we worry about lootboxes or attention manipulating algorithms?
Because we should worry about why people make stupid decisions of their own free will.
Remember Farmville? It looked like a very dangerous stupid idea. Are we safe from it because we banned it or because we got bored with it? Because we learned to recognize the lie?
Before we ban lootboxes, a whole generation will learn how stupid they are, even without any concerted effort. Will the next generation know that? If the ban is successful, then no. And they will be vulnerable to the next gambling scheme that comes along in their lives.
Who will be better prepared to face gambling as an adult? The person who got bored with lootboxes, or the person who never saw them because they were perfectly protected?
> If I sent you up against professional Nazi-denial content, and you knew jack about the Nazis - I guarantee you would think they had a point.
You are making my point for me. You will encounter Nazi-denial content in your life no matter how hard you ban it. Better to be prepared to recognize fake points. The same goes for antivaxxers, flat-earthers and such. Those things being exposed and ridiculed for what they are has done far more to harm the attractiveness of those stupid ideas than any ban we have imposed or might impose.
> There are some ideas that are simply that toxic - handling them without training is a surefire way to radicalize someone.
Someone, sure. But they don't immediately die or kill from internalizing a toxic idea. There's a window to act, and there are ways to boost resistance to stupidity.
> First - I doubt that flies in America, where Fox news is one of the largest networks
Isn't that the core of the problem? That advertisers are allowed to slap their ads on such stupid content and boost it with their money and make avoiding it extremely hard?
Instead of banning specific ideas we should regulate advertising.
How much harm could have been done if YouTube had slapped ads on everything corona-related as it trended because of the pandemic? Instead they demonetised anything that even mentioned corona. It had a sobering effect on YouTube creators.
If there was a law that could demonetise Fox, how long would they last and how much impact would they have? How long would Alex Jones last if he was not allowed to lie while advertising his supplements?
To combat the problem of stupid ideas you have to incentivise things that make us better at recognizing them and disincentivise things that shove them in our faces at high power.
> Second - even if you did start classes for kids and teens - it still leaves your adults vulnerable.
Yes. Adults are harder. Maybe antivaxxers and Nazis should be something that only kids and qualified adults are allowed to talk about on advertiser-financed content. ;-)
I don't know the exact solutions, but we won't figure them out if toxic ideas are made legally taboo. It's the same way that banning specific drugs left us very slow to understand and deal with addiction, and left us completely open to the opioid abuse epidemic.
> I don't even know if it is possible to de-radicalize the already radicalized. If you don't stop the radicalization, you will simply have more of such people.
It is possible. I'm in the process of de-radicalizing my mom from antivaxxer propaganda, and I'm doing it by educating myself on what they are saying and why exactly it's stupid (specifically, and as an example of wider misinformation patterns) and relating it to her in ways she can understand. It's work, but it gives results.
Why do I have to do it? Because she was nearly 60 when she first encountered this level of advertiser-supported promotion of something this dangerously stupid. And she lived through state-mandated communist propaganda with way less personal and intellectual harm.
> So sure, get those classes going, but its only one part of the solution.
You can't get classes going if you make the subject forbidden knowledge. Just limit the shoving of propaganda in people's faces until they properly learn to recognize it. Make it possible to avoid and harder to encounter without accompanying critique.
The best solutions, imho, appeared where Facebook couldn't ban Trump. They had to fact-check him and annotate his stupidity with the truth.
Unfortunately, since they no longer fear him, they swiftly shed the burden of educating people and reverted to the lazy solution of just banning him from the platform while happily promoting other dangerously stupid ideas (ones not under such strong public scrutiny as Trump). And people cheerfully allow them to avoid responsibility for the power of their platform just like that.
We are all acting as if ideas drive this recent wave of stupidity, but what drives it is advertiser money.
Sorry mate, these are assertions - I too have held them.
I simply have tested them, at length.
Which is why I encourage you to go ahead and join the fight, go volunteer earnestly as a mod.
If you survive burnout, you will see the contours of the battlefield - our human neurological weaknesses, the resources strung out on either side, the fragility of the structures we need as a society.
I repeat - all of these are simply assertions, and as a forum of techies, we should be testing these ideas - people here created tech which allows us to do this like no generation before us.
In the spirit of science - go for it.
I’ve spent years studying this, even wrote a dissertation on the subject space. I’ve had these discussions many times, I’ve seen what happens when people decide I am wrong and set out to prove it.
I have always wished people who take that task on the very best. If you reach a different conclusion, I will be the happiest person in the world.
Props to you for putting the work in, and I don't deny your conclusions. Researching this must be very hard.
I just believe that there could be hope on an epidemiological scale, even though we have to accept that many, many cases are irrecoverable.
I believe that the average resilience of people to stupid ideas could be improved. And I believe so because social networks degraded it with very specific mechanisms (like the mere-exposure effect), not just by putting people in touch.
We shouldn't rely on security through obscurity when it comes to dangerously stupid ideas.