Hacker News
[flagged] At Least 15% of Reddit Content Is Corporate Trolls Manipulating Public Opinion (medium.com/collapsenews)
150 points by nabla9 10 months ago | 80 comments



The title is false. It seems to derive from the following paragraph:

> The results were alarming, with 15% of the top 100 subreddits found to have content that was likely posted by bots or corporate trolls, specifically aimed at promoting certain companies or organizations.

It's not 15% of content. Nowhere even close. It's 15% of top subreddits that might contain just a single post considered "likely" (no actual verification) to be from what they call a corporate "troll".

In other words, it's a completely meaningless statistic.


None of the citations match the claims, either. The author appears to have fabricated one of the article's most important citations while misrepresenting the other [0], which is par for the course for this author [1][2][3][4].

[0]: https://news.ycombinator.com/item?id=38700636

[1]: https://news.ycombinator.com/item?id=38627266

[2]: https://news.ycombinator.com/item?id=38523309

[3]: https://news.ycombinator.com/item?id=38371060

[4]: https://news.ycombinator.com/item?id=38199597


I would argue that anecdotally it's been well over 15% at various times.

Nearly every second post on /r/movies for a few years was "Check out this amazing old classic that's way better than I remembered, it's streaming on Netflix now!"

It's also really hilarious when a new movie drops and you read the comments in /r/movies. It's so, so simple to spot the generated comments: "I found this refreshingly hilarious and fresh", "I found the character development very engaging", or "the director did such a great job on this".

These are always in among the real comments where people sound like actual people, and often say the movie was crap.


I came straight to the comments here because I absolutely knew the headline 15% couldn't possibly be correct. Thank you for the clarification.


Well...yeah. With some free time about a year ago, I did a bunch of the Reddit ads formula modules, then veered off into reading threads from actual users seeking guidance on how to leverage the platform this way (without paying Reddit to run a campaign), and they all discuss this practice.

On the 'official' side, Reddit publishes the things you'd expect:

- run your ads on our cool platform to reach your customers of tomorrow...today!
- case studies/success stories/etc.
- Reddit is a great platform to improve sentiment about your_brand!

On the 'unofficial' end, users discuss how to circumvent actually paying the platform to achieve their desired results, relying on astroturfing or other clandestine methods because it's much cheaper, with a much more attractive upside if a conversion lands. There are also a lot of scammer accounts that send junk like this: https://imgur.com/a/q106SKI


I've encountered specific situations on reddit where there was a clear government-organized campaign influencing public opinion using bought accounts. This was in 2017, before GPT-like AI-generated content was a thing. It included comments, links to news articles, and clearly coordinated upvoting.

It seems that you wouldn't even need AI generated comments. Just an AI with sentiment analysis that upvotes organic comments that support your views and downvotes organic comments that oppose your views.
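The mechanism described above (votes driven by sentiment analysis rather than generated comments) can be sketched in a few lines. This is a purely illustrative Python sketch, not anything from the article; the keyword lexicon and all names are invented, and a real system would use a proper sentiment model and the platform's API:

```python
import re

# Hypothetical illustration: score each organic comment with crude keyword
# sentiment, then decide how a manipulation bot would vote on it.
PRO_WORDS = {"love", "amazing", "great", "refreshing", "engaging"}
ANTI_WORDS = {"crap", "boring", "overrated", "flop", "trainwreck"}

def sentiment_score(text: str) -> int:
    """+1 per pro-view keyword, -1 per anti-view keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum(w in PRO_WORDS for w in words) - sum(w in ANTI_WORDS for w in words)

def vote_action(comment: str) -> str:
    """Decide what the hypothetical vote bot would do with one comment."""
    score = sentiment_score(comment)
    if score > 0:
        return "upvote"    # amplify comments that support the pushed view
    if score < 0:
        return "downvote"  # bury comments that oppose it
    return "ignore"

for c in [
    "I love this movie, the character development was amazing",
    "honestly it was crap, a boring overrated flop",
    "saw it last night",
]:
    print(vote_action(c))  # → upvote, downvote, ignore
```

The point of the sketch is that no text generation is needed at all: ranking manipulation alone, applied to comments real users already wrote, shifts which opinions are visible.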


You can also do the opposite.

E.g. if somebody posts yet another coke/mentos meme or video you can link those threads to research showing how damaging the packaging is or how they once supported the assassination of unionizers.

They don't like that sort of thing at all.

I caused Amazon to nuke an associate IAmA thread once because links about what it was really like to work in a warehouse kept popping up and interfering with the corporate-approved story.


I think that is fine with corporate accounts. But I would not do that for government organized campaigns.

I know this is a controversial opinion on HN and it borders on paranoia. But I consider a few ideas. First, to a determined government there is no such thing as an anonymous account. I act as if my reddit, HN, Instagram, etc. is just a column in a database that is linked to my ip address, email, phone number, physical address, government id / passport. Second, I act as if every post I make is being analyzed for sentiment against a set of categories of content. If I trigger enough of those sentiment categories then I end up on a list. I'm sure I am on some lists already - I would prefer to minimize the number of lists that I am on.

It seems to me that purposefully trying to subvert a government organized information campaign would get a person on the kind of list that I would prefer not to be on.


I do this in youtube comments whenever I see a chain of "I invested in ____ and made 1700% in 4 months!". I'll reply with "my friend tried that and lost $8000, nobody will return her messages".

I'm surprised coke+mentos is used as product placement. It seems like it's damn close to being a bomb (if you were clever enough to screw the lid on before triggering the reaction) and thus a liability.


I think the more interesting question is whether anybody who is being paid to manipulate public opinion is a moderator on Reddit. A moderator position has to be much more useful to an astroturf actor than a mere poster, because an astroturf mod can remove content that whoever is paying them doesn't like.

Reddit and Wikipedia are the major platforms most vulnerable to this kind of attack because they're moderated by volunteers chosen from within the community.


It wouldn't surprise me if there were subs with this problem.

There are definitely subs where it isn't a problem, and any kind of astroturfing is strongly stamped out by mods (e.g. /r/personalfinance runs a relatively tight ship given the size of the sub).

But it wouldn't surprise me in the slightest if there were subs with mods whose actual day job is astroturfing.


I am fairly convinced that /r/worldnews is modded by, and overrun with, US State Department and other spooky-type shills.


The main mod /u/Maxwellhill is Ghislaine Maxwell. Everything matches up, including the date of arrest.


r/middleeast


Given that YouTubers with access to a few million eyeballs are paid 6 and 7 figures a year by brands, I would be shocked if a few of the reddit mods who mod almost all the biggest subreddits are not taking cash on the side. They literally control what hundreds of millions of eyeballs see.


The problem is getting into a position to actually moderate a sub... or building a community entirely from scratch.


At least 15% ...this is correct, because the true number is likely much higher. Outside of niche communities, Reddit is just another government and corporate mouthpiece. It helps to manufacture consent and control by simulating “conversations” and normalizing positions that often do not reflect reality.


15% is a gross, gross understatement, but it's probably across all subreddits. On mainstream subreddits the absolute majority are astroturfers, bots, and propagandists. The majority of bots also perform sentiment analysis and upvote comments that manufacture consent.


Not just government or corporate, either. Basically, it's an "anyone" sock puppet, and most groups, corporations, and governments use it, from the US government to the Ku Klux Klan.

People just have to learn to question everything they see on reddit, and indeed on the internet in general. Including every post on this comment thread. Including mine.

Act with wisdom by being skeptical of all of it.


At most 15% of Reddit content isn't sockpuppets


I don't know, 15% may be correct, but the problem is that once a large enough number have put their weight behind something, the real people fall in line and finish the job.


Wow. Wow. I've always suspected this, especially after we had our daughter. My partner would be on new-parent groups, and over time she would hear about new products or bring up new ideas to solve some problem I didn't know we had. I find it really pernicious that the tactic is to inject doubt and uncertainty about your adequacy as a caretaker. I got really suspicious, but she told me it was brought up by other "moms" in the groups. I had my suspicions, but seeing it somewhat confirmed is something else.

Argh.


I can't even open Reddit anymore, it just shows a text blob with something about the user agent (which hasn't changed). I wouldn't even care but much of my first page of search results is full of Reddit links.


They've just started blocking some VPNs apparently. I'm getting the same.


It's not just on reddit though. It's everywhere and goes way back long before the internet.


I wish there was more transparency regarding similar shenanigans around here. dang's message to Jeff Geerling was revealing.

https://www.jeffgeerling.com/blog/2022/i-almost-got-banned-h...

> https://news.ycombinator.com/item?id=30273905 was heavily upvoted by a criminal spam service that steals accounts from HN users and then sells upvotes and comments using the stolen accounts. That's basically the equivalent of a capital offense on HN and we ban accounts and sites that do it.

> Don't think that we are too small to matter. Even on our tiny community, stealing accounts and selling upvotes/comments is lucrative.

There is also a recent intel report which says bad actors have been moving to "alternative online mediums" but they (frustratingly) don't list them.

https://www.dni.gov/files/ODNI/documents/assessments/NIC-Dec...


Operation Mockingbird


LoL the government trolls are here in force, as well.


This is certainly evident in media-related subreddits. The usual hypeboys will post about Upcoming Release, little tidbits from the original author or a behind-the-scenes shot. This continues on some kind of increasing curve probably first graphed by Skinner until the fans are trembling like teacup poodles at the sound of a can opener at dawn. Anything less than frothing enthusiasm will earn you a bevy of downboats from the "Don't Ask Questions, Just Consume Product and Then Get Excited for Next Products" Corps.

I occasionally like to remind them of just how gullible and torqued out of their minds they were for The Dark Tower. Idris Elba! Stephen King said a nice thing about it! Meanwhile anyone with a shred of sense could sense the oncoming trainwreck, Blaine or no. But these folks are part of the advertising infrastructure that extends all the way out to Rotten Tomatoes, now shot through with strands of fungal falsity, and for every Morbius-like copypasta of ridicule which finally emerges, another twenty tentpole releases, artificially extended, will prop up the entire moth-eaten and threadbare canvas.


The sad part was seeing every franchise or TV show be co-opted by corporate suits, to the point where you can be banned for not liking the latest episode. The situation reached a point where every franchise has the "official" boring, sanitized sub and an alternative, community-driven one.


I expected it to be much higher than that


The article says "at least" for a reason. To anyone that pays attention, it appears to be closer to 80%.


Possibly, like most discussions, the loudest few totally control the conversation.


They've got voting guns.

If you see a thread on e.g. /r/programminghumor with a lazy, unfunny meme starring $BRAND it will usually have an oddly high number of votes relative to comments.

The votes look rigged in the comments in those threads too.

Point out the corporate shilling or mention /r/hailcorporate and you'll wind up with -5 votes and some aggressive comments telling you to lighten up or that you're wrong or both.


Are there any large subreddits (>1 million) that aren't a joke?


/r/AskHistorians


Tangent: As an anthropologist by training these people infuriate me by not letting the peanut gallery chip in on a related thread. They might have every living source on a subject in the same room and they'd never know it.


I can't shake the feeling that this is just people rediscovering this as an issue. This has been happening since the Digg exodus, hasn't it? [0] The frequency and sophistication have obviously increased, but I thought this was pretty much public knowledge.

[0]: https://www.reddit.com/r/IAmA/comments/p9a1v/im_woody_harrel...


Corporate influence aside, it's a bad place to get your opinions from due to the hive-minded echo chamber comment/post ranking system. Any contrarian opinion that contests the most popular opinion is simply not shown, because over half the users will downvote it and bury it. To make it even worse, the mods are usually unreasonably power-hungry or opinionated too. You can kind of have a debate deep in a comment tree, but that isn't very visible.


Last summer I wrote some software to coordinate Reddit bots to do this [0]. It was pretty cool watching it go. A combination of the API change and my innate hatred of ads, however, made me stop working on it.

0: https://github.com/j-jacobson/buzzmasters


This self-reported characterization of corporate manipulation seems suspect. If I get up here on HN and say "Gmail is a good service that works well and respects your privacy and security" 95% of you will accuse me of being a corporate shill, but it's just a vanilla opinion. Anyway, this author doesn't bother linking to the Pew study and searching for "The Pew Research Center study, conducted in 2018, delved into the experiences of 2,505 adult Americans who use Reddit" reveals nothing, so I can't be sure their description of the study is accurate, or that it even exists. I'm certainly not going to sign up to read this members-only post, the free part of which is almost worthless.


I fully agree that the self-reporting in this (alleged?) study is suspect. I've been accused of being a corporate shill many times on reddit just for voicing an opinion contrary (or even just partially contrary) to the bulk of the thread. This seems to be a lot of people's favorite way of dismissing dissenting opinions (accusing dissenters of being trolls is still a close second, though), and it's likely that it makes other people believe it's actually true when a few people start accusing dissent of being corporately motivated in that way. It's such a common occurrence that I refuse to comment in most larger subreddits. Smaller subreddits are much more reasonable, in my experience.


The article is paywalled and cites a paper published on an Elsevier platform. Is this anything other than wild speculation? Seems like the kind of content that goes viral because it validates our gut instinct.

Not that I disagree with the made up number. I expect most social media to have a very high amount of manipulative content, especially the high-visibility content.


I can't even find the paper. The full article can be found at [0] and there is no citation for the study, just this:

> Two significant studies, the Pew Research Center study conducted in 2018 and the Computers in Human Behavior study published in 2020, have shed light on the prevalence and impact of corporate trolls on Reddit.

So I went looking for 2020 publications in the journal, and found 8 articles containing the word "reddit" [1], none of which were about corporate astroturfing.

[0]: https://archive.is/D60ep

[1]: https://www.sciencedirect.com/search?qs=reddit&pub=Computers...

Edit: When I include 2021, I get more articles from 2020, but still nothing related to corporate trolling on reddit or even just astroturfing on reddit in general.

Edit 2: The Pew study, on the other hand, is linked [3], but it's from 2017 and (from what I've skimmed) it doesn't talk about corporate trolling/disinfo other than to say that it happens and linking to news articles which are merely reporting on fake news and thus do not substantiate any percentage claims.

[3]: https://www.pewresearch.org/internet/2017/03/29/the-future-o...

Edit 3: I checked the reddit links cited as sources and none of them seem to support any percentage claims, either, just that disinformation campaigns exist and that trolling exists.

I have to wonder if the author is making stuff up as part of some broader agenda or is just trolling, because if this is just a troll, well played.


>I have to wonder if the author is making stuff up as part of some broader agenda or is just trolling, because if this is just a troll, well played.

That's what was alleged in comment to another article from this medium blog:

https://news.ycombinator.com/item?id=38627266


Wonderful.

I did some searching at my university's library to get a broader journal selection and cannot find anything related specifically to reddit disinfo campaigns beyond US elections, but even then, nothing to suggest a specific % of reddit posts are fake. Judging by the author's other works, they seem to be on par with Real Raw News for bullshit artistry.

Sadly, this fake news is spreading like wildfire: https://www.reddit.com/submit?url=https%3A%2F%2Fmedium.com%2...



@dang the author has created yet another account


The article is true: 15% of reddit content is bots or corporate trolls. That's what the study says.

All the articles are backed up by citations.

These are based off surveys; this is why so many people upvote, they know it's true.

Stop trying to silence people for political reasons.



This mentions trolls and bots that contact people; those are obvious and, in this context, of limited interest in my opinion.

The professional propaganda dissemination is the frightening part, and it is far more pervasive than this study indicates.

That is not specific for Reddit, it goes for all big sites. I am entirely certain that it is also the case here on this site.


Most visible once OpenAI launched their product. Suddenly people were parroting “cat’s out of the bag”, “ai learns like a human”, “people already getting replaced by ai”, “ai is helping me be a better programmer”, and a bunch of other lines repeated in a nearly identical format.


It has been quite visible for a few years now. Especially in the last election cycle, you'd see some rhetoric promoting a specific view and it would be repeated word-for-word around reddit. Then when someone inevitably wrote a response that really debunked its bias, the posts would stop for a few hours. And then they would come back, modified to handle that debunking.

It wasn't even subtle... it isn't like I sit around scouring every discussion on reddit, but it wasn't hard to see that pattern even just as a casual reader browsing the site to kill a little time.


During the 2016 election it was hilarious to see that any mention of the name “Hillary” in any context would immediately receive downvotes and a bunch of negative comments.

Especially on Twitter it was just madness. Some guy with like two followers showed his tweet history (predictably with little community engagement), but then he posted a comment mentioning her name and instantly got a bunch of vaguely-related replies.

That people fell for this just blows my mind.

I have friends in Australia that still hate her for - you know — all the unspecified things she’s done. They can’t list any of them of course, and have no real idea of why they so viscerally hate a retired foreign politician, but they do.


I saw a lot of comments like that on HN, especially about Copilot. Honestly they confused me, because I used Copilot for a while and it was like they were using a totally different, way better product.


Exact same experience with ChatGPT. While it may work as advertised for basic use cases, for the most part it feels as if people are using something completely different. At the end of the day, if these products were really that good, they wouldn't have to use dark patterns to force their adoption.


Not sure OpenAI has to do that. I was skeptical before trying ChatGPT, but once I did I was like wow... and it does make me a better programmer and more productive in so many ways.



I'm sure it's not just reddit but all social media and sites like HN.


The 15% figure could be suspect, but I feel like the phenomenon exists, especially when people critique full self-driving services in r/SanFrancisco.


If you ever mention glyphosate on Reddit, or even Twitter or Youtube, corporate trolls immediately come out and start attacking. It's crazy to see what groups/industries have the biggest presence.


Just a reminder that this happens on Hacker News, too.

Check out one pro-Google astroturfing account that they caught back in 2020 and froze for posterity:

https://news.ycombinator.com/item?id=22383746

Apparently this was a sock puppet using other accounts for the same purpose.


That's nothing. The front page seemed virtually taken over a decade ago.


s/reddit/internet/


https://twitter.com/reddit_lies/status/1619567947349098496

"Daily reminder that Reddit accidentally outed Eglin AFB as the most “Reddit addicted” city.

Eglin AFB has been a huge part of the Pentagon’s social media manipulation campaigns for over a decade now."

URLs given at link: https://web.archive.org/web/20160410083943/http://www.reddit...

https://arxiv.org/pdf/1402.5644.pdf


The Pentagon doesn't know about VPNs?

What about Oak Brook, IL and South St. Paul, MN? Both cities seem quite small; are they also the Pentagon's playground?


That's not the most mind blowing thing.

Reddit's director of policy is a hire from NATO. She was in charge of Middle East/Russian containment as part of NATO, then moved to reddit for "social media warfare". Check out the current reddit director bios.

Here this person is talking about Russian/Syrian deal: https://www.youtube.com/watch?v=GbgpwDVS1aA

Here are Syrian rebels (ISIS) requesting US support to fight Russians: https://www.youtube.com/watch?v=pEcyMdS9pl4

Here is reddit director, then NATO strategist, Op-ed in NYTimes requesting to arm Syrian Rebels (ISIS) with anti-aircraft to "fight Russia" : https://www.nytimes.com/roomfordebate/2016/02/22/is-there-ac...

If you think Russia invaded Ukraine in vacuum and US is being benevolent, you win the geopolitical darwin award.


No one thinks Russia invaded Ukraine in a vacuum. Putin is such a weak, weak leader that he had no choice but to walk into Dark Brandon’s trap. It’s really too bad Russia doesn’t have any agency to leave the quagmire they’re in, but that’s what happens to weak countries with very weak leadership. So now they’re left with no choice than to sacrifice more Russian soldiers than the US lost in both theatres of WWII, just because the real decisions in Russia are being made in Langley, Virginia.

Or something.


American tech firms are propaganda and information warfare outlets for the US.

But so what?

Russian tech firms are propaganda and information warfare outlets for Russia.

It's disingenuous in the extreme to call out the US for this behavior when every great power on earth engages in the exact same behaviors.


We need "the Reddit Files" next.


I'd love to see a spoof army recruiting video:

I'm Gina Lopez, Weapons Systems Specialist!

Hi, I'm Rod Gunderson, Combat Medic!

And I'm Eddie Sorenson, Social Media Manipulator!

"We're ARMY strong!"


Why spoof it when the actual MOS for our Eddie up there is 37F "Psychological Operations Specialist"


I was aware that stuff like this is going on, but I think the spoof video would show the juxtaposition of how one of these things isn't like the other. I'm sure this has been discussed extensively so I'm probably not adding anything by saying this MOS crosses the line with the military actively engaging with civilians, during peace time. Manipulating netizens is not "serving your country".



Attractive woman in Army uniform used on social media, you say? ...

https://mronline.org/2023/06/08/from-simp-to-soldier/

"If Lujan feels like a psyop (a psychological operation) it is because, technically, she is. Lujan is a psychological operations specialist; one of a small number of Army personnel whose job is to carry out influence and disinfo operations, either on or offline."


You mean Lujan?


The title is not supported by the text in the article.


I saw the same thing. The article discussing people "being contacted by bots run by companies" is a billion light-years away from "15% of content on reddit is corporate trolls".


[flagged]


Then again, what is an advertisement if not a manipulation of public opinion? So maybe this result is not surprising after all.


As a general rule, if I see a finance announcement on the news, I immediately assume the opposite is true.

“Nobody has to worry about their jobs.” — time to polish the resume.



