
I'm not sure about this particular case, and there's no question that the specific interest here probably has very little to do with concern for democracy and a lot to do with power struggles in the new cold war we're in.

(Gonna agree with it being a soft coup if they limit the new election to pro-Western parties only. So far, it's "only" a repeat of the election.)

But, having said that, there really is a lot of pro-Russian propaganda on TikTok, and the way the algorithm surfaces it can't always be explained by user preferences.

An Austrian newspaper recently published the results of an experiment they ran themselves: they created a batch of brand-new accounts posing as teenagers. The stated interests were diverse, but all of them apolitical and typical kids' stuff.

Nevertheless, after a short habituation period of benign posts, the feeds of all but one of the accounts quickly shifted from typical teenager fare to "political" content, mostly hard right-wing, Islamist, and pro-Russian clips. All of this without any of the users ever having given any indication that they were interested in political posts, let alone pro-Russian ones.
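
For what it's worth, here's a minimal Python sketch of what that audit design boils down to. All the names (Persona, audit, fake_feed) and the toy feed model are mine, not the newspaper's code, and TikTok obviously exposes no API like this; the real experiment was done with fresh accounts on real devices. This just illustrates the methodology: apolitical personas, repeated feed sampling, and tracking how the political share of each feed drifts over time.

    # Hypothetical sketch of the audit design described above -- my own
    # naming and a toy feed model, not the newspaper's actual code.
    import random
    from dataclasses import dataclass, field
    from typing import Callable

    @dataclass
    class Persona:
        name: str
        interests: list[str]                      # all apolitical topics
        history: list[str] = field(default_factory=list)

    def audit(personas: list[Persona],
              fetch_feed: Callable[[Persona, int], list[str]],
              is_political: Callable[[str], bool],
              sessions: int = 20,
              clips_per_session: int = 50) -> dict[str, list[float]]:
        """Record, per persona, the fraction of political clips per session."""
        drift = {p.name: [] for p in personas}
        for _ in range(sessions):
            for p in personas:
                clips = fetch_feed(p, clips_per_session)
                p.history.extend(clips)
                drift[p.name].append(sum(map(is_political, clips)) / len(clips))
        return drift

    # Toy stand-in modelling the reported effect: the political share of
    # the feed grows with exposure, regardless of the persona's interests.
    def fake_feed(p: Persona, n: int) -> list[str]:
        bias = min(0.6, 0.0004 * len(p.history))
        return ["political" if random.random() < bias else "benign"
                for _ in range(n)]

    if __name__ == "__main__":
        teens = [Persona(f"teen_{i}", ["gaming", "music", "sports"])
                 for i in range(5)]
        for name, shares in audit(teens, fake_feed,
                                  lambda c: c == "political").items():
            print(name, round(shares[0], 2), "->", round(shares[-1], 2))

In the real experiment the classification step was presumably a human watching the clips, which is why audits like this are so labor-intensive; the interesting output is the per-account drift curve, not the toy numbers here.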

The report is here (in German): https://dietagespresse.com/selbstversuch-so-radikalisiert-ti...

The paper usually publishes satire, but this article reports a genuine experiment they ran on themselves.

Medium-hot take: this is why closed-source social media post-promotion algorithms should be banned. We should not let a foreign private company with government ties influence society in a hidden way like that.
