
> "They just don’t want to take responsibility for what that algorithm does"

We need to stop pretending that the suggestions in your YouTube/Facebook/Insta/Snap/TikTok feed are chosen by an Algorithm. It's a computer program directed by humans with an agenda.




Rather than subscribing to this conspiracy theory, I can believe that it's The Algorithm (tm) doing it. It learns which videos are engaging to other users and serves you the same stuff; if you don't follow the pattern, it might serve other videos with the second-highest rate of engagement...
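
Roughly, I imagine it as something like this (a toy sketch with made-up engagement numbers and topics, not anyone's real ranking code):

    from collections import Counter

    # Made-up global engagement rates per topic (e.g. watch-throughs / impressions).
    topic_engagement = {"tennis": 0.42, "lego": 0.39, "luxury_cars": 0.31, "makeup": 0.28}

    def rank_topics(user_skipped: Counter, demote: float = 0.5) -> list:
        """Order topics by global engagement, demoting ones this user keeps skipping."""
        def score(topic):
            return topic_engagement[topic] * (demote ** user_skipped[topic])
        return sorted(topic_engagement, key=score, reverse=True)

    # Someone who skipped two luxury-car clips gets the next-most-engaging topics first.
    print(rank_topics(Counter({"luxury_cars": 2})))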

I watched a tennis video on YouTube, and YouTube kept recommending me tennis videos. Same thing with LEGO videos on Instagram; I admit they do grab my attention, so the Zuckgorithm knows "Keep showing him LEGO stuff!". I also like car videos, but Instagram serves too many videos from jagoff posers showing off luxury/sports cars.

Whereas for a "typical" woman the videos are about clothes and make-up...


>> It's a computer program directed by humans with an agenda

It most probably is a program written by humans, and presumably it is designed to measure and optimize certain metrics such as clicks or ad views.

> Rather than subscribing to this conspiracy theory, I can believe that it's The Algorithm (tm) doing it.

"These sources reveal that in addition to letting the algorithm decide what goes viral, staff at TikTok and ByteDance also secretly hand-pick specific videos and supercharge their distribution, using a practice known internally as “heating.”"

https://www.forbes.com/sites/emilybaker-white/2023/01/20/tik...
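
In ranking terms, the "heating" described there is just a manual boost layered on top of whatever the model predicts; something like this sketch (IDs, field names, and numbers invented, based only on the article's description):

    # Video IDs staff have hand-picked for "heating" (invented example data).
    HEATED = {"vid_123", "vid_456"}
    HEAT_BOOST = 5.0  # arbitrary multiplier, purely for illustration

    def final_score(video_id: str, predicted_engagement: float) -> float:
        """Model score, supercharged if a human chose to heat this video."""
        return predicted_engagement * (HEAT_BOOST if video_id in HEATED else 1.0)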


The TikTok feed suggestion mechanism in particular feels like an engagement-seeking algorithm with a few hundred content-safety rules taped on top of it.
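
Something in this spirit (rules and field names invented, obviously not TikTok's actual code):

    # Invented safety rules: each is a predicate over a candidate video's metadata.
    SAFETY_RULES = [
        lambda v: "slur" in v["caption"].lower(),   # naive keyword match
        lambda v: v["user_reports"] > 100,          # report-count threshold
        # ...in practice hundreds more, added reactively as new evasions show up
    ]

    def build_feed(candidates, k=10):
        """Rank purely by predicted engagement, then strip whatever a rule catches."""
        ranked = sorted(candidates, key=lambda v: v["predicted_engagement"], reverse=True)
        return [v for v in ranked if not any(rule(v) for rule in SAFETY_RULES)][:k]

Keyword-style rules like those are exactly what the euphemisms described below sail straight past.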

Much like LLM content moderation, I don't think it's possible to fully control, especially given the adversarial nature of content creation and consumption.

OnlyFans models and far-right political groups seem especially effective at finding ever more indirect ways around the moderation rules.

For example, this week there was a series of videos in my feed making racially controversial statements using Golden Retrievers and Pitbulls as euphemisms for people of different ethnicities.

I'm pretty sure the TikTok mods don't actively want racist content, but it's very hard, to the point of impossible, for them to remove this stuff entirely given the pace of evasion techniques.



