I am worried about information dumping -- just flooding the Internet with an insane amount of AI-generated data to drown out the "real" data. And I don't mean "real" necessarily in the sense of genuine or man-made, but simply not fake or slightly divergent. Imagine a world where only 1 out of 1000 news stories or tweets is real... the amount of damage you can do to institutions, democracies, causes... just continue fucking with people to the point where they have no idea what is real and what is fake, give up truth-seeking altogether, and give in to the loudest/most dominant narrative. Invest money into an AI farm that just spits out a fake every second and see if the fact-checkers and well-researched alternatives keep up.
You don't have to imagine that world, it is already here. The vast, vast majority of "news" sites out there are already straight up propaganda or ad-focused pseudo-grassroots bullshit. Maybe not yet completely AI-generated, but that seems like it would only be the cherry on top of the cake for the people behind it. We need solutions for this, not ways to prevent what has already happened.
There's so much low-quality content on the internet now - and it gets even worse in comment sections - that the only way you'll be able to detect if something is AI-written is if it's actually higher quality, or coherent.
True. That also requires a good portion of critical thinking and education in general, which is, let's be honest, rather rare in most comment sections. I fear that in the future we'll look at today's YouTube comment sections thinking these were the good old days ...
People lying have always been faster than people doing journalism. In the past you just followed sources you thought reliable. Then came social media, where reputation doesn't matter and the fastest wins, driving the current brand of "decline of journalism".
Inventing faster ways to lie won't meaningfully change the equation, the driving forces are the same.
Before, printing was expensive, so you usually had some established company that could be held liable for libel. Now the hosts are exempt (not necessarily the wrong choice), and they have millions of publishing members posting so rapidly that you could never address it with libel lawsuits.
Before, people may well have been gossiping and slandering all over the place in person, but the part of the media aimed at literate audiences operated under some ground rules for debate: at minimum, no malicious libel.
I'm not too worried about this, because we already have this to an extent where it is required for a particular publisher to build up a reputation. People will learn to pay more attention to reputation, and publishers will apply more scrutiny in return.
>People will learn to pay more attention to the reputation and publishers will apply more scrutiny in return
A subset of people will, but most will not. Right now, obviously fake news stories are shared left and right without even a modicum of rational analysis applied to their plausibility.
The average person is simply incapable of rational thinking about anything that is not day-to-day, and this will not change.
Indeed, Neal Stephenson in _Anathem_ (2008), describing an alternate world (in which his "reticulum" is our "network"), wrote: "Early in the Reticulum—thousands of years ago—it became almost useless because it was cluttered with faulty, obsolete, or downright misleading information."
"So crap filtering became important. Businesses were built around it. ... " Generating crap "didn't really take off until the military got interested" in a program called "Artificial Inanity".
The defenses that were developed back then now "work so well that, most of the time, the users of the Reticulum don't know it's there. Just as you are not aware of the millions of germs trying and failing to attack your body every moment of every day."
As I review this now, I'm sad to realize that our flood of "bogons" (bogus information) is not generated by opposing armies wanting to plant misinformation, but by advertisers, influencers, and politicians. In effect, by those who want to sell us things we don't need, didn't want, and often can't afford.