
But doesn't YouTube try to enforce the rule by flagging such content and removing it?



If it's not overtly violent or sexual, my experience is that they actively promote it.


Google very much leans against conservative content. By this point, big tech's suppression of conservative viewpoints is so plainly obvious and widespread that it's the Occam's Razor explanation.


Firstly, they don't, not even a tiny bit. All they care about is "numba go up" and "numba not go down".

The fact that they've rustled your jimmies means they've mostly succeeded in their supporting objective - getting you to think they are actually engaged, and not just playing whatever side serves their ends at the moment.

Secondly, if you don't think they're promoting reactionary claptrap for clicks, I have to presume you don't use their services all that much.


Anybody's guess whether they're doing it right now.

YouTube has been the first really big experiment in libertarian algorithms serving functional purposes. They tried to get 'engagement'. Succeeded in the sense that there's no rival YouTube in practice, so yay them? Unfortunately the most efficient method of getting humans to provide this engagement is to turn 'em into monsters and cultists, something the algorithm did not account for.

And here we are: the algorithm doesn't get to just prioritize engagement anymore. I honestly do not think Google were serving political ends there. I think they may have drawn the conclusion that seeking raw engagement inherently serves certain political ends, even ends that threaten their own company's existence and freedom, and therefore they are trying to bump up a level and define what is 'good' engagement, without abandoning engagement itself.



