Hacker News

I use YouTube infrequently. Usually for binging random content like comedy sketches or speedrunning. No matter what I'm on, there's always a few videos in my recommendations that make no sense and look like trash. Not generically, but specific videos that ALWAYS appear. One of them is "I PAID FIVE BASSISTS ON FIVR TO PLAY AN IMPOSSIBLE BASSLINE"

...

Why is this so bad? What are your hundreds of data scientists doing?




I think this is an exploration/exploitation trade-off.

YouTube only knows your interests by the videos you watch. It knows you like speedruns, so it will show you speedruns, easy. But you certainly have other unrelated interests: cooking, woodworking, old cars, whatever. So, from time to time, Google takes a chance and shows you a random topic just to gauge your interest.

It works on a global scale too. They want some variety; they don't want to depend only on a few topics and previously successful YouTubers. So sometimes they may pick a random video and show it to everyone, and if it succeeds, they have something new.

It's actually fairly easy to tell which is which when you look at your recommendations. Typically you'll find a mix of videos related to the video you just watched, videos related to your global interests, trending videos, and random stuff.

If some generally effective machine learning algorithm acts really stupid from time to time, chances are that it is exploring, that is: it is asking you a question instead of giving you an answer.

Edit: someone mentioned the multi-armed bandit; yes, this is such an algorithm.
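(To make that concrete, here's a minimal epsilon-greedy bandit sketch. Purely illustrative: the reward signal, the arm values, and the 10% exploration rate are made-up stand-ins, and YouTube's real system is obviously far more involved.)

```python
import random

def epsilon_greedy(estimates, epsilon=0.1, rng=random):
    """Pick an arm (topic): usually the best-looking one (exploitation),
    but with probability epsilon a uniformly random one (exploration)."""
    if rng.random() < epsilon:
        return rng.randrange(len(estimates))  # explore: ask the user a question
    return max(range(len(estimates)), key=estimates.__getitem__)  # exploit

def update(estimates, counts, arm, reward):
    """Fold an observed reward (say, watch time) into a running average."""
    counts[arm] += 1
    estimates[arm] += (reward - estimates[arm]) / counts[arm]
```

The occasional "stupid" recommendation is the `rng.randrange` branch: a deliberate question, not a mistake.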


I don't think it's bad actually. I mean the recommendations could be far better, but this specific case isn't bad.

Say you go on reddit/r/iloveredcars - all you will see upvoted (i.e. RECOMMENDED by users) will be red cars. You'll never see blue cars. You'll never know they even exist. All you've ever seen are red cars. Not only that, but you'll notice that over time it's the same red car pictures being reposted over and over again.

By inserting "wrong" results a small percentage of the time, they ensure that users get exposed a tiny bit to different content that maybe they end up liking, opening an entire new tree of recommendation opportunities.

Now I'm sure some people don't want this and they do want to see the same red car pictures over and over, but personally, I appreciate that they do this.
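(A toy sketch of that injection idea, assuming a ranked list and a wider catalog — hypothetical names, and a real system would weight this far more carefully than a uniform coin flip:)

```python
import random

def diversify(ranked, catalog, p=0.05, rng=random):
    """Return a copy of `ranked` where each slot is, with probability p,
    swapped for a random item from the wider catalog (the 'blue cars')."""
    return [rng.choice(catalog) if rng.random() < p else item
            for item in ranked]
```

With p=0 you get the pure red-car feed; any p>0 occasionally lets a blue car through.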


aka Multi-Armed Bandit


The function of an adtech recommendation system is not to find content that you want to see but rather to show you content that's the most profitable for the adtech firm that's serving it.

The videos that YT recommends over and over again are probably videos that have a high ratio of advertising revenue (meaning good demographics) to playback cost (meaning low overall byte size) and a high likelihood of going viral by being shared off-platform (meaning broad appeal, non-controversial, short, safe for religious conservatives, safe for work, and easy to summarize within 280 characters).

The videos you want to watch don't meet these criteria, so YouTube won't find them for you.


Recommending you a video you don't want to watch is a wasted opportunity. If you don't want to watch it, you won't, so YouTube's earnings are zero. If something you do want to watch is recommended instead, you will, and YouTube's earnings are nonzero.

I'm right now on a browser without my account, on which I don't usually watch YT, basically only when someone pastes a link while gaming. About half of the front page is things I might have clicked on when bored. A third of them are long. Two are definitely controversial. I don't buy your hypothesis.

Disclosure: I work at Google, but my closest contact with YouTube was drinking whisky with a YouTube SRE some four years ago.


I'd say there's definitely a lot of "LOOK AT MY YOUTUBE SEO TITLE" but in actuality that bassist guy is really good and generally plays some great riffs.


It's strange.

It's like a wild species with vibrant, sometimes garish colors saying 'Look at me!', when to discriminating viewers it's typically poisonous.

I'm pretty cautious.

But for a large number of my favorite content creators it's a bait and switch in reverse: they bait your clicks but actually provide really meaty content.

That aside, my personal favorite trend in the digital media space is the 1-4 minute tutorial for people who already know the software.

These people typically make money outside of YouTube and don't rely on snaring views.


> "use YouTube infrequently"

> "binging random content"

gets recommended random popular content

> "why are recommendations so bad"

They can't read your mind, you know


They could probably infer that if I'm watching comedy, and the last 5 videos were comedy by the same artist, I don't want to watch some guy's channel about bassists that they've tried to get me to watch 100 times before.


If they only recommended comedy from the same artist, you might then have the same issue as another comment: that recommendations aren't novel enough.

I'd take your point if they only recommended bass videos, but presumably it's just one recommendation in a list of many, with the point being to explore your preferences while also satisfying your demands.


The problem everyone is complaining about is that they used to recommend a good mix of content that was actually relevant. Like if you were watching a bunch of comedy videos by one comic it would recommend more of them but also another comic that you would probably enjoy. I don’t know how they did it, maybe it was just based on what other people with similar demographics and interests watched. If it was then they probably broke it when they decided to pop the filter bubbles. Turns out some of those bubbles were useful, they just decided to throw the baby out with the bath water.


If they used to recommend a good mix of content, what do you feel is the issue with recommendations now? Not enough variety? Not enough relevance?


> They could probably infer that if I'm watching comedy ... I don't want to watch some guy's channel about bassists that they've tried to get me to watch 100 times prior.

davie504 is basically a musically-themed comedy act, so this doesn't seem that unreasonable.


Urgh, him. I watch a lot of music videos (as in videos about making music + music theory). I cannot get this guy off my recommendations. He has the personality of a sofa cushion. I cannot understand why he's popular.

Other than that, I'm pretty happy with my recommends. I think the Google spooks have built a good file on me.


I use this Firefox add-on; it puts a cross next to videos and comments and lets you hide videos and comments from those users/channels in the recommendations, search results, etc.:

https://addons.mozilla.org/en-US/firefox/addon/youtube-clean...


They stop appearing if you specifically mark them as not interested


I do this, and it does work, but it doesn't seem to learn that I don't want to see generic youtube garbage.

I'd love for it to learn that anything with a shocked face thumbnail and CrAzY question for video name is not desired.


Shocked face or a big red arrow pointing to something in the thumbnail make the video an instant pass for me. It would be amazing if YouTube's algorithm could recognize that.


Every so often I do that. I spend five minutes selecting channels I'm not interested in, because I watched a gamedev video once and now I'm being suggested hundreds of gamedev videos, or whatever. Pretty soon, it's like the algorithm just... gives up and starts suggesting me generic videos about celebrities and other vapid things that I'd never click on in a million years.


I often AM interested in a topic, I just don't want every little piece of content drivel shoved down my throat. I still want the cream of the crop to occasionally pop up in my feed.

That's the problem with marking things as not-interesting. The algorithm will wildly overreact in the opposite direction.


I've marked dozens of videos on a single topic as "not interested", only for them to be replaced by different videos on the same topic soon after. Marked all those as "not interested" again, and the next day there were more.

The only thing that works for me is finding the video in my watch history that's causing the recommendations and removing it there. Often it's something I didn't even watch all the way through, or even gave a thumbs-down; apparently a thumbs-down also means I secretly want more of this content?

YouTube's recommendation algorithm is somehow worse than useless.


The fact that the feature exists shows that even YouTube knows its algorithm needs help.


Everyone lauds TikTok’s algorithm for some reason, but they have the same thing and it seems just as worthless there.


TikTok by design has more fresh content, and by being newer its algorithm is not gamed as much as YouTube's.

With enough time TikTok will look just as bad. I.e. I don't think it's algorithm-related, for the most part.


Hahah! I get this one too.


> One of them is "I PAID FIVE BASSISTS ON FIVR TO PLAY AN IMPOSSIBLE BASSLINE"

Slap like right now




