
> if I watch a video on vegetarianism, then another one, it may strengthen the confidence that I am interested in vegetarianism, and conclude that they should be recommending me more content about vegetarianism

Recommendation algorithms don't "conclude" anything, though; they are merely statistical tools meant to surface somewhat relevant content, and they are never perfect because there is a lot of noise in what everyone watches.




> Recommendation algorithms don't "conclude" anything, though; they are merely statistical tools meant to surface somewhat relevant content, and they are never perfect because there is a lot of noise in what everyone watches.

A. Jeesh, there's no problem with informal anthropomorphizing in these situations. When a human has a goal and gets feedback that X moves them towards that goal, the human chooses X. The combined system of Google-corp + developer + algorithm is also goal-seeking and making choices, so anthropomorphizing the system is appropriate.

B. The problem we're talking about isn't "noise" but "feedback": a goal-seeking system that muddies its final result with its initial state. Essentially, bias, a situation that's quite common in statistical systems.
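To make the feedback point concrete, here is a minimal toy sketch in Python (purely illustrative, with made-up weights and a watch-only positive signal; it is not how YouTube's system actually works): whichever topic the recommender happens to favour at the start keeps getting shown, watched, and reinforced, so the initial state dominates the final scores.

    import random

    # Toy watch-only feedback loop (hypothetical sketch, not YouTube's algorithm).
    # The user is only slightly more interested in topic "a" than in topic "b".
    true_interest = {"a": 0.55, "b": 0.45}   # the user's actual preferences
    score = {"a": 0.5, "b": 0.5}             # recommender's belief, starts neutral

    random.seed(0)
    for _ in range(1000):
        topic = max(score, key=score.get)              # always recommend the top-scored topic
        watched = random.random() < true_interest[topic]
        if watched:                                    # "watching" is the only signal,
            score[topic] += 0.01                       # and it is always treated as interest
    print(score)   # "a" runs away; "b" is never re-evaluated after the initial tie-break

The bias here comes entirely from the tie-break at initialization, which is the "initial state muddying the final result" described above.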


"Not perfect" is an understatement. They are horrible. Not just because of what the article is about, but because their recommendations are ridiculous even outside political content.

And YouTube's rules also motivate creators to churn out a lot of content regularly (leading to quick-to-produce crap) and punish those who take their time to think or research before they talk.


It becomes much less horrible when you consistently train the algorithm with the "I'm not interested" button. You can even specify whether you're not interested because you already saw the video, dislike the topic, or dislike the channel. My YouTube home page is pretty good now, after doing this about 20 times. Any time I watch something vaguely conservative and start getting "SJW CUCK OWNED!!!" everywhere, one button press is usually enough to revert back instantly.

The problem is that probably 1% or less of users ever click it.
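As a follow-on to the toy model above (again with purely hypothetical weights, not YouTube's actual scoring): an explicit "I'm not interested" click can plausibly carry far more weight than the small positive bump a watch provides, which is roughly why a couple dozen presses could flip a whole home page.

    # Continuing the toy recommender sketch (hypothetical weights for illustration).
    WATCH_BUMP = 0.01              # implicit signal: a watch nudges the score up a little
    NOT_INTERESTED_PENALTY = 0.2   # explicit signal: assumed to carry much more weight

    def update(score, topic, watched=False, not_interested=False):
        """Apply one impression's feedback to a topic's score."""
        if not_interested:
            score[topic] -= NOT_INTERESTED_PENALTY
        elif watched:
            score[topic] += WATCH_BUMP

    score = {"outrage bait": 1.4, "cooking": 0.5}   # a gap like the runaway loop produces
    for _ in range(20):                             # ~20 clicks, as described above
        update(score, "outrage bait", not_interested=True)
    print(max(score, key=score.get))                # now "cooking"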


> "Not perfect" is an understatement. They are horrible.

They would be much less horrible if, every time someone watched a video, they rated it in some way, instead of the algorithm assuming that "watching" = "interest".

> And YouTube's rules also motivate creators to churn out a lot of content regularly (leading to quick-to-produce crap) and punish those who take their time to think or research before they talk.

That's not just YouTube: almost all media is driven by what's "new" and "hyped" rather than what is deep and well thought out.


Watching does equal interest if you continue to stay on the topic. Interest certainly does not mean approval, and it can often mean "I find this content extreme and outrageous."


I am interested in the topic of WWII, including watching movies about the Nazis. I am not interested in the Nazi-stormtrooper-adjacent alternative-history channels YouTube recommends to me as a result.

I am interested in a scientist's or writer's YouTube channel about her own topic. Not so much in the confidently cranked-out crap on the same topic.



