The trick to improving YouTube recommendations is to regularly prune uninteresting videos and channels. Eventually this should start surfacing videos you're more likely to find interesting.
I've actually hit the end of my recommendation stream on the mobile app after marking dozens of videos with Not Interested. Once this happens it starts recommending videos you've already watched or older videos from creators you've been watching lately.
Personally, I'd love it if you could curate the recommendation system a bit more. I'm never interested in rewatching stuff I've already seen. And the recency bias for recommendations means you're always getting more of what you've recently consumed. I'd like it if older interests were mixed back in. Finally, I'd love to be able to blacklist certain topics and games entirely.
There's so much missed potential with YouTube recommendations.
Some might say that you can still use search, but that has been crippled to include unrelated recommendations after one page of results. Not only that, I want YouTube to help me discover new content. When the recommendation system works and it shows me new kinds of interesting stuff, it's superb. TikTok is eating YouTube's lunch in this regard.
Edit: forgot to mention this. On mobile if you do a recommendation refresh it'll show a New to You filter which you can use to explore more diverse recommendations. I don't know why it's hidden in this manner.
At some point Youtube search was neutered. I get 5-10 results for any given query before Youtube gives up and just offers me recommendations rather than search results. Interestingly, if I search google for [search term] site:youtube.com, I actually get pages of results which are never present in the Youtube search. Sadly, it seems that Google has intentionally diminished Youtube's own search function in order to prioritize use of the recommendation system.
Search is entirely crippled for news stories now. If I saw some breaking news on Twitter I used to pop over to Youtube to find original or better footage than news services were getting, or to get it faster. That's impossible via YT's search now; it's just pages and pages of old-media channels with their thoughts on the footage, always clipped down and talked over. Even Twitter seems to do this now, it's becoming harder to find stuff that doesn't have a layer of approved views slathered over the top.
Funny that youtube is basically the opposite of google's search, which will often only show you results relevant to top/recent stories, making it very difficult to get results about anything else.
It doesn't even have to be that recent in some cases. Searching for "Sandy Hook Elementary" along with just about any other search term is going to give you a page with results about a shooting that took place a decade ago. Searching for a proper noun when it's the top story of the day is pretty much only going to give you that one story.
I use DDG now but I wish the search tools weren't just last week, last month, last year but greater than a week/month/year so I could discard more recent results. It's quite the puzzle to find ways to remove things, as using the -search-term-I-don't-want or +term-I-do-want is increasingly less reliable, in my experience.
Yeah, google pretty much ignores search modifiers these days and DDG (which I also use) routinely fails to understand what -term means. I still dream of the day we have a search engine that supports regular expressions.
There was one, back before the great clearing of interesting search engines that Google brought about. I forget its name but I used it for code searches.
I used to love that about Google News in the early days. A search would give you results from all over the world, with different outlooks and perspectives. I found it enlightening to read against my own selection bias.
Then they removed that value point and aligned what they presented with my existing bias - at which point I stopped using it.
It looks like it's owned by a Russian company. Does that affect what it allows you to see? I notice for instance that a quick search for "War in Ukraine" returns top results from Russian news outlets calling it a "Military Operation".
Of course it affects what you see. Search engines are a form of soft power that states can't afford NOT to influence.
As a user of them, I make sure to use multiple engines (preferably with distinct indices and jurisdictions) when I research something essential that affects my long-term future. This is because I distrust how politics is done (industry lobbying).
Look at how Larry Page disappeared off to Fiji and just dropped out of the tech and business world. That to me says a lot about what's become of Google.
>As a user of them, I make sure to use multiple engines
You may be interested in our alternative YouTube recommendations. Search a channel to find a list of other channels making similar videos. For example, our recommendations for How Money Works include other established econ focused channels like Economics Explained and Money & Macro but also smaller more obscure channels like Wall Street Millennial:
I've recently started using Yandex and have to say I also find it refreshing. My favourite is the image search; I just find it superior to other search engines.
I absolutely understand the annoyance-factor of "Pinterest-itis" (though it seems that either they've dialled their "you must absolutely register now in order to see anything"-behaviour down a little bit again, or maybe it's just that half-forgotten userscript I installed at some point – either way it's currently not that infuriating for me), but given that e.g. direct image indexing seems almost non-existent for Instagram, being able to stumble across some images from there via the detour of Pinterest is better than nothing I suppose. (It's only infuriating if somebody saved an image to their computer/phone/… and then manually re-uploaded it to Pinterest again, so that the source URL of the image is then missing…)
Likewise if the original image has already disappeared from the net – finding a copy via Pinterest is still better than finding nothing. (Interestingly Bing's search can serve a similar purpose – compared with Google, it feels like Bing is much more reluctant to remove 404'd images from its index again. While those broken results are annoying, too – and unlike Pinterest neither Google nor Bing store (at least publicly) full-resolution copies of those images – at least you can try plugging the preview image into reverse image search and with some luck maybe find an alternative copy of that image that has survived elsewhere.)
And finally, the keywords added by Pinterest are possibly also helpful in making images more discoverable. (This might be no longer that relevant now that Google is also running image recognition on the images it indexes and no longer has to solely rely on the hope that any text near that image or the rare "alt"/"title"-text is actually relevant to the image contents, but it might still help a little.)
Just about any engine not using Google (as Startpage does) or Brave uses Bing, since Bing is the source of just about all indie engines' results. And even Brave's image results come from Bing for now.
AFAIR this happened after the Christchurch shooting in New Zealand. YT was full of POV videos of a mentally ill idiot going room to room killing people. As a quick fix/backlash the very next day YT search was totally broken, and it stayed that way.
I've tried everything I can think of but it seems that news on Youtube is now much like watching television. If there's a trick I don't know it, though it looks like Yandex will have a new customer.
Yep, I confirm that is my experience as well. Google is starting to inch towards the television industry. In all honesty I'm not watching any TV these days, but I imagine it's some advertising and snippets of content. Youtube without membership or ad blockers is just like that, if not worse. From how it started to what it has become, it's a long way…
Yup. I now pretty much use YouTube search only to call up a video where I have the YT ID. Everything else is a waste of time, and I just use DDG search if I want to find something without the ID in hand.
How management thinks it is a good idea to make their products useless by letting a bunch of self-aggrandizing "analysts" or marketers or whatever attempt to manipulate their customers/viewers is beyond me. The fact that they are all paid to fork it up so badly is just mindboggling.
Myspace was the place to be until one day it wasn't. YouTube is a behemoth but it too could fall one day. For now the amount of information it has is just too valuable, but it is becoming less enjoyable to use the site all the time. On iOS I don't have an ad blocker and am forced to watch countless ads, and then when I watch the ad the video fails to load. So I refresh, and sometimes that works, sometimes I need to do it several times before the video will play. It is a shame they have gone so downhill over the years for those of us old enough to remember the beginnings.
As a regular user, my expectation is that the very first results from a search would be from matching the title string (possibly after trying to match the unique video id seen in the url), only afterwards would results from keyword matching and other such metadata show up.
Maybe I'm out of touch but I don't think this is an unreasonable expectation.
It's not unreasonable, and it's exactly how it used to work until Google started messing around with inserting ads into every aspect of the site (they very much want you to click on the ad-laden videos you were NOT looking for). They changed the default and relegated it to be a search modifier.
"I get 5-10 results for any given query before Youtube gives up and just offers me recommendations rather than search results."
I search from the command line using a short shell script.^1 Perhaps it depends on the query, but I consistently get more than 5-10 results. For example, I might get 26 results with the initial query. By continuing the query repeatedly I can usually get over 1000 results total. YouTube search results are delivered as JSON so I wrote some simple, quick and dirty utilities in C to reformat the JSON to TSV with only the details I want. The workflow and example script below use those utilities, i.e., yy059, yy062 and yy064.
#!/bin/sh
# with arguments: fetch the first page of results for the query
# with no arguments: read the previous JSON response on stdin, extract its
#   continuation token (via yy059) and POST it to the Innertube search
#   endpoint to get the next page of results
HOST=www.youtube.com;
case $# in
0)
  KEY=AIzaSyAO_FJ2SlqU8Q4STEHLGCilw_Y9_11qcW8;   # public web-client API key
  # minimal client context to wrap around the continuation token
  MIN=\{\"context\":{\"client\":{\"hl\":\"en-US\",\"gl\":\"US\",\"clientName\":\"MWEB\",\"clientVersion\":\"2.20210916.06.01\"}},\"continuation\":;
  # take the last token in the previous response and build the request body
  TOKEN=$(yy059|sed -n "s/$/}/;s/\"token\":/$MIN/p"|tail -1);
  URL=https://$HOST/youtubei/v1/search?key=$KEY;
  curl --http1.1 -A "" -H "Content-Type: application/json" -d "$TOKEN" "$URL"
  ;;
*)
  # initial query: spaces become '+', fetch the ordinary results page
  S=$(echo "$@"|sed 's/ /+/g');
  URL=https://$HOST/results?search_query=$S;
  curl --http1.1 -A "" "$URL"
esac;
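For illustration, driving it looks roughly like this (the script name, file names and query are hypothetical; yy059/yy062/yy064 are my own utilities mentioned above):
# first page of results for a query
./ytsearch.sh cast iron restoration > page1
# next page: feed the previous response back in so the continuation
# token can be extracted and POSTed to the search endpoint
./ytsearch.sh < page1 > page2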
I also have a short script that retrieves all video details for a given YouTube channel. These results are also delivered as JSON. I use search to discover interesting channels, then retrieve the full lists of videos for those channels.
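That channel script isn't shown here, but as a rough sketch (not my method above): YouTube also publishes a public per-channel RSS feed that needs no API key, though it only returns the most recent uploads rather than the full list:
# CHANNEL_ID is a placeholder; replace it with a real channel id
CHANNEL_ID=UCxxxxxxxxxxxxxxxxxxxxxx
curl -s "https://www.youtube.com/feeds/videos.xml?channel_id=$CHANNEL_ID" |
grep -o '<title>[^<]*</title>'   # crude title extraction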
Absolutely agree that YouTube search got neutered.
Yup. Search reveals two or three relevant videos but from that point on the search results are indistinguishable from the regular feed. And I get the same video recommended to me all the time. Sometimes back to back!
I feel like I also learned the other day that Google’s search is bad like this too. That it’ll say there’s millions of results and just stop after like 400.
An unqualified search reports (at this writing) just north of 30 million results (30,000,229). It will display only 34 pages' worth of results, I believe 1,000.
As discussed recently on HN, the match count is useful information even where all matching results are not displayed as it indicates whether or not a query is generic.
It's gotten so bad the past couple months. All I get now are maybe 2-3 videos of what I want, and everything else is previously shown BS that's not even related, or recommendations. They want to become TikTok so bad.
Edit: Add to that the increase in number of ads per video and their length...ugh
I do exactly this. Google and Duckduckgo/Bing all provide better Youtube results than Youtube. I kind of hate the UX of the platform now. But there's still so much great content on there from its better days.
So many smart people there and such an idiotic solution - this whole short-sighted, mind-narrowing, engagement-"boosting" recommendation system must be trained on the attention horizon of a fruit fly.
It's probably companies advertising their products getting salty their new movie or something got 90% downvotes and YT is all too eager to please its ad overlords.
YouTube's search has never been as good as Google's for finding videos because Google search knows about the video and also an entire crawled web of links and references.
The biggest problem with this approach is that it takes a lot of curation to get your recommendations into decent shape, and just one accidental video view to have your feed fill up with uninteresting content related to that video. This asymmetry is the root of all the recommendation system problems.
Similar on Facebook: I click through to a scam product to gather details and send a complaint to ASA (in UK). I then say "don't show me stuff like this", block the "story", block the advertiser.
Now consistently I get adverts from Facebook for that same product from other scammers, no matter how much I block them. Presumably because that advert 'engaged' me for the longest. Eugh.
Incidentally, ASA responded that it was not a UK company ... which sounds like "we don't care about scam advertising we're just here for the paycheck". The company were advertising on UK Facebook and held a UK registered trade mark.
Facebook cares so little about scam content I'd regard them as actively complicit in the scams by this stage. Reported a scam as blatant as a Nigerian Prince (crypto product with "guaranteed income" of thousands per week which is "recommended by Martin Lewis" who actually took legal action against them over fake recommendations, according to a fake news website which rips off BBC News web design) to them and the feedback widget later informed me it had been reviewed and found to not be violating their scam policy. And yes, I got more of that type of website in my feed due to "engagement", which I suppose at least wastes advertiser dollars...
I'm sure Facebook advertising is a waste of time now, it certainly is if anyone behaves the way I do on the platform:
• I've conceded that adverts are going to show up in my feed regardless and most advertisers can't be bothered to target properly so are mostly irrelevant
• I take great pleasure in "engaging" with the obvious scams and shitty crypto/NFT ads in the hope it's eating their ad budget
• this makes the adverts I see extremely irrelevant in a reinforced feedback loop
I assume you're right, which puts us in a position where:
- search is neutered/broken, so finding new interesting videos is a chore
- recommendations need to be manually combed regularly to remove dumb videos
- watch history also needs to be combed to remove potential false signals
That's a lot of curation to get somewhat decent recommendations. Not saying we have a choice, it's not like Vimeo will make a comeback to save us, but that's a pretty bad place we're in.
YT has hidden tags that tie a video to your account after you view it; removing the video from your play history does not affect this, and it has been an issue with YT for nearly 6-7 years at this point.
I don't know how you can reach it from within YT itself, but I know that if you do a datadump of your account, you can see it in one of the text files within.
It, unsurprisingly, has a history of _lots_ of different things; you'll be shocked at how much Google knows about you. Probably things you yourself have forgotten.
That's why although I do subscribe to YouTube Premium (the only Google service I still use), I only use it from an account that is not used for anything else, on a single tablet that is not used for anything else, and pay with a card that is not used for anything else.
It works for me and other people I know. Especially useful when I'm baited or my finger slips. Paired with flagging recommendations as "Not interested" or "Don't recommend channel" it's very effective.
This right here is exactly it: my number one use of incognito mode these days is when clicking on YouTube videos that have been linked from somewhere else where I'm trying to keep their content out of my recommendations.
Can this not be at least partially solved by relying more heavily on a user's feedback (like/dislike buttons, numerical rating systems, and any other feedback mechanisms) to better understand preferences of the viewer? Along with simply being more patient and diligent when trying to discern actual user desire?
It seems like the problem of asymmetry in recommendation platforms is more a choice from a design perspective than a challenge yet to be figured out.
I'm far from an expert tho so just a thought from a layman
Vimeo will never be effective because they've already issued a statement that they neither want to be like YouTube nor host the same variety/types of content.
Same thing with task specific videos I watch. I just wanted to see how to remove the starter from my van. I’m not a car enthusiast. Please stop polluting everything in my recommendations based on these one off videos.
Yeah the recency bias is really annoying to me. I'll get really interested in some new topic for a bit and watch a bunch of videos about it and it seems like YouTube takes that as a signal to bump other topics out of my recommendations, even if they're things that I've consistently been interested in for a long time.
I wish I could somehow "save" the state of my recommendations profile so I can revert it if it goes off the rails.
You’ve nailed most of the curation, but the way I get new recommendations is that I search out new stuff on YouTube. At this point if there’s any subject matter I’m interested in that’s not political or social issues, I’ll do a search on YouTube in addition to my other searches (Google, Brave Search, Reddit, etc.) to see what I can turn up. This often nets me additional recommendations down the line, often when there’s a new video on that subject posted when I’m not expecting it.
Low quality videos also get nuked with prejudice from my history.
> Some might say that you can still use search, but that has been crippled to include unrelated recommendations after one page of results. Not only that, I want YouTube to help me discover new content. When the recommendation system works and it shows me new kinds of interesting stuff, it's superb. TikTok is eating YouTube's lunch in this regard.
This, and it's going too far, since Youtube is now polluting people's search with very questionable recommendations. For instance, I'm into CGI, and Youtube search kept recommending me random footage of a female streamer who behaved and dressed very suggestively and already had millions of views. What does that have to do with shaders and raytracing? Why the hell does Youtube want me to look at that?
Google have a ton of other ML features about you that they fit to the model, and so you have been profiled as a bit lecherous.[1] Do you feel good about that?
Think how it must feel to be profiled to be a crook or a bad credit risk or something else that actually impacts your life and is false. It's a little worrying.
[1] Other people like you like this so it is assumed you do also. Without "lecherous" ever being mentioned or even anything like suggestively dressed appearing as a model feature. It is really, really easy to accidentally train a statistical model to be wildly prejudiced. It seems significantly harder for us to categorize it when it happens as "other similar decisions are made by shockingly violent racist and evil humans and this model and those using it should be treated with the same or more derision as they claim not to be so and ought to know better." Significantly harder for many good reasons as well as bad ones.
They addressed the "political rabbit hole", but the powerful nature of algorithms to change people's lives remains completely unaddressed. People's tastes, views, behaviors, career paths, music tastes, and personalities are being influenced on a mass scale by totally black-box algorithms that are manually tweaked by unaccountable employees. People's psychological vulnerabilities are taken advantage of, as you can clearly see with the big rise in people faking Tourette's syndrome because of TikTok, which psychologists are now having to account for in their diagnosis and screening.
"A modern clinician needs to remain abreast of social media sources as knowledge of media content is essential in managing patients in the current environment."
We've all met people that adopted wholesale the mannerisms of this-or-that online influencer; gestures, tonality, jokes and all. It's creepy as hell to see, and sure, algorithms are reinforcing this phenomenon. An "obsessed fan" is YouTube's ideal viewer, since his engagement will be through the roof.
Spotify is unprofitable, so they allow the biggest artists to pay to be promoted by their algorithms, which shapes the world's music tastes.
Fat people are being recommended mukbang videos.
These algorithms can be studied, as TFA does, and you can try to hold these companies accountable, but there's limited utility in doing scientific studies of algorithms that get updated multiple times a day.
In today's world, manmade algorithms from a handful of huge tech companies are likely exerting more influence over human evolution than nature itself is!
> Why the hell does Youtube want me to look at that?
One answer is that some assumptions that YouTube makes about the viewers are based on very broad stereotypes; the geek who's into CGI also has a sexual side that they are about to discover through YouTube.
It's not only recommendations that are broken. YT has a very sluggish frontend (I have to suffer it particularly on my X260 laptop), and for us multi-lingual people they translate stuff, so you can't tell from the search results which video is in which language.
I don't like YT quite honestly, and it's been a while. I understand it's a business, but you don't have to provide such a bad experience to make money.
For searching videos I'm relying more and more on external search, because with YT I just get suggested crap, like with Google. I'm beyond tired of this.
Recommendations for old videos drives me crazy. There is a magic the gathering YouTuber who releases daily content that I watch (at least in the background) daily. I want to see the current state of the game and the meta being played. Videos of his from two years ago do nothing at all for me. He has years of content, and I don’t want to actually block his channel from recommendations so I guess I just put up with it.
Similar story with a guy who reviews indie games daily. I want to see the future (potential) indie hits not the games from years ago.
I've had this. I take a couple extra seconds to specifically say don't recommend me anything from this channel, and then my suggestions get bombarded with that style of video from different channels.
The most blatant was a specific "Get robux" video that appeared constantly in my feed from dozens of different channels, and that triggered this extra engagement in the form of blocking, reporting and disliking.
I’m pretty sure it’s exactly this, just time spent on page no matter what you’re doing even if it is scrolling past dozens of recommended videos trying to find something half decent.
I've done this aggressively in the past. The effect is so poor it's absolutely not worth the effort. I honestly couldn't imagine writing a system worse than their current recommendations. Straight up random would be a better system in some situations.
I never do this because I have no idea what it'll do. If I tell YT I'm not interested in a video will it hide the whole channel from me? Even secretly unsubscribe me as it randomly did multiple times?
If I had to describe YouTube's recommendation algorithm I'd say "undefined behaviour". It might recommend the same uninteresting video every day, but also hide an interesting, subscribed channel's videos for months even though I liked almost every one of their videos. It might mix videos I already watched once with completely unrelated search results.
At this point Google could probably improve their algorithm just by flipping a coin.
Appreciate it can never happen but I'd love filtering rules for YouTube's home page. "Nothing I've watched in the last N months, nothing older than N months, nothing from the blocklist, nothing from X about Z, nothing from Y after YMD", etc.
None of this works for me. I get constant recommendations for (what I consider to be) asinine lifestyle videos, and music videos completely unrelated to my interests.
It doesn't matter how many times I say I'm not interested in K-pop, it doesn't help.
I'm regularly recommended videos from Richard Wolff that I've downvoted and flagged as not-interested. It's so annoying that I've often thought about making an entirely new account, but the amount of work to re-subscribe to all the channels isn't worth it. I really wonder why YouTube repeatedly recommends you things that you dislike and switch away from. Something is broken in the algorithm, or it's working as intended. Either way, it's a frustrating user experience.
The second you notice it surfacing things you don't want to see because you accidentally clicked a rec, you have to go into history and delete the video that brought forth the infestation.
So far I’ve found it does stop showing you if you remove what caused it from history.
>> The trick to improving YouTube recommendations is to regularly prune uninteresting videos and channels. Eventually this should start surfacing videos you're more likely to find interesting.
Unfortunately, Capitalism doesn't work like that :-(. Back in the old days, Google searches used to be organic searches in the genuine sense, and that was the best Google we ever had. But today's Google is the Big Tech Capitalist, and so are their algorithms. The organic logic would follow your pruning and show you what you want, but the capitalist logic will only show you what they're paid to show you! (i.e. propaganda, politically motivated content, inciting content, etc. etc.).
They have to be capitalist in order to be a profitable company. The problem has more to do with their size and monopoly status. They have the whole market and no credible pressure from consumers threatening to leave for other businesses and so they just do whatever they want which is chasing internal trends and politics.
I have done this for recommendations, and it now only shows videos I have recently watched, usually the 3-4 previous videos I watched.
If I click on anything I wouldn't usually watch, it is then the only thing that goes in my recommendation for a good 2-3 weeks.
I watched one dashcam video the other day that someone linked me, which had the tags Tesla and Elon Musk.... Now the only thing I get in my recommended is incredibly strange videos about Elon Musk and Teslas.
> I'm never interested in rewatching stuff I've already seen.
That's a weird position. The whole point of YouTube (used to be, anyway) was long-form content. YouTube is not new anymore. You really have no interest in watching some good 10 minute video you saw 9 years ago?
I also don't want to be recommended videos I've already watched. If it's something I might want to see again then I save it in a playlist, I don't need YouTube pestering me to re-watch old stuff.
YouTube already knows how to fix it. All they have to do is revert to their pre-2016 recommendation engine. They panicked when they, along with the other socials, thought they somehow got Trump elected, and they kludged their algorithm to death. After that it's just been one ham-fisted patch after another. Personally I think all of the people who knew how the previous system worked are gone by now, pushed out by demands to use the algorithm to shape user behaviour. Now it's just whoever doesn't have a moral problem with such scummy practices, which is likely a basket of juniors and ideologues who barely know how to glue code together.
I didn’t like when they removed the dislike counts, and I still don’t like it.
When I look for videos of folks working on Subarus, I don’t want to have to manually scrub through the video to gauge the quality. Before I’d just look at the ratio and if I was feeling generous I’d preview the frames to see if they had what I wanted.
Now it takes me at least 2 mins longer to get to the same judgement which begs the question of why I’m watching a video when I have a book on my lap and a car waiting outside.
The problem with the dislike button seems to be that nobody agrees on what it should be used for.
In previous HN discussions, it's become apparent that some people use it to say "I am not interested in this video".
That is of zero use to other users. The video isn't necessarily bad, it's just on a topic you don't care about.
If you are instead using it to mean "this video is factually wrong" or "this video is obnoxious" that might be useful.
If you are using it to say "I don't agree with this political viewpoint" or "I don't like this movie/superhero/game/TV franchise", then that's also probably not that helpful.
Apparently you cared enough about it to start watching it after reading the title. A video with "Rust" and "Tokio" in the title will probably be unwatchable if it has a like-to-dislike ratio of 1 to 3, even if the title signals that it's a video you want to see. But since that ratio no longer exists, you'll have to watch it partially only to notice that it is bad. And then dislike it (and then have this dislike action have no effect whatsoever on future recommendations, apparently).
I absolutely hate it that they removed the dislike counter, it was so valuable.
I would guess that for a given class of videos, there's going to be a fairly regular amount of dislikes that are going to be because of behaviour like "I am not interested in this video". So if you come across a video in that class of videos (e.g., car repairs), and it has a much higher ratio of dislikes to likes, it's probably a good indication that there's more to those dislikes than just "I'm not interested" -- more likely the video is factually wrong, poor quality, etc.
If it's a "how-to" video the dislike counter would show that this method doesn't work anymore or never worked well in the first place
Especially for software or websites, for some reason I can't find niche stuff easily on search engines, but there's an Indian guy that just tells you how to use a website. If the website doesn't work anymore I'd like to see a decreasing thumbs up ratio.
I did find a website that let me watch a video with violence in it without logging in to youtube. It looks like they just mirror the video you access from youtube. But this kind of a site will easily get shut down with some legal notice from YouTube.
I actually want a thumbs graph, to see if the video has been gaining likes or gaining dislikes recently.
If only we had a way to spend billions of $¥€ on putting the greatest minds in the World into a single company who could solve this at the drop of a hat.
It has to be control, right? Surely Google are competent enough to fix this overnight if they wished?
The _only_ problem with the dislike button was that it affected ad buyers (game/movie publishers) and YT corporate - the worst crime of all.
>On December 13, 2018, YouTube Rewind 2018: Everyone Controls Rewind became the most disliked video on the video sharing platform, with 15 million dislikes, rapidly surpassing the music video for Justin Bieber's song "Baby"
You can dislike random people, but Susan hearing that her star employees' content is shit is where you cross the line. Especially when it happens again next year: "YouTube Rewind 2019: For the Record"
How much do these factors matter, though? Similar arguments can be made about uselessness of the like button. Some press it to indicate good, relevant content, some press it to support the topic, some like factual content, some use likes for later reference in "liked videos".
If you pivot a bit to Facebook videos, you can see heavily "liked" videos where huge portion of top comments are literally some form of "waste of my time", "idiotic video" and so on.
It’s not of zero use, even if it’s imperfect. The perspective shared in the comment you’re replying to is a great example of this, unless you deny the validity of their reasoning or experience.
I think the solution would be to just make the down-vote do nothing but increment the visible counter, and also make it so that users can't comment if they already voted. This allows people to simply signal their disapproval without fighting repetitive flamewars in the comments or suppressing content they don't like.
Or maybe if you're running a forum where you want to promote controversial content and hide boring content, just make the downvote act identical to the upvote, so users are incentivized to downvote and argue if they think it is argue-worthy and just ignore if they think it is too boring to discuss.
If people are using "dislike" to indicate that they're not interested in content, particularly in a context in which there are few other ways to do so, then that is what the button means.
You might extend that more broadly to any aspect of a UI/UX.
In a past life I designed some elements of a content moderation / feedback system. Among the more amusing aspects of that was encountering people who told me that I was using the system I'd designed wrong.
As an aggregate, having a largely positive or largely negative feeling about something still provides good information to whomever's asking. It may not be ALL the information, but it's still hugely valuable.
It's not dissimilar to the buttons asking if you enjoyed your washroom visit. Sure, it seems vague and stupid, but if one of your 30 washrooms is suddenly reporting 50% "dislikes", you know something is wrong.
I know people get upset about the political implications of She-Hulk or COVID or Rings of Power dislikes, but surely one would think the positive information for the content creator outweigh the negatives here. There were times when all of Justin Bieber's content sat at around 5%. This was probably an annoyance for his PR team, but I doubt many of them thought the feature should be removed entirely.
I know, let's try using the provided search filter feature to sort by rating. Surely this will give some useful listing that prioritizes user deemed quality!
Testing this with "Subaru repair" (no quotes), the top result has 11 views from 2 years ago.
I used to calculate the ratio of likes:dislikes, and a lot of my hobbies were in the cool 1,000:1 range. Now I do the same with views:likes; 10:1 isn't uncommon for some niches!
I believe that uses a separate database that was seeded from scraped youtube data but now it's only updated by those who use the addon. So it would have a much smaller sample size for recent activity.
I'm not sure about this, because when going through Youtube shorts I regularly see videos with 100K+ dislikes, and any "viral" video has tens of thousands of dislikes. I don't think this many users installed an extension to bring back dislikes and happened to dislike these particular videos.
The extension estimates the "true" dislike count based on the like/dislikes it has in its database. E.g. if the database contains 100 dislikes and 900 likes, but the video has a million views, it'll estimate 1000 dislikes. I'm sure there's some cleverness in there for videos where it doesn't have a tonne of information yet.
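A back-of-the-envelope sketch of that kind of extrapolation (the scaling basis here, the public like count, is my assumption, not necessarily the extension's exact formula):
sampled_likes=900       # likes observed among extension users
sampled_dislikes=100    # dislikes observed among extension users
public_likes=9000       # like count YouTube still displays
echo $(( public_likes * sampled_dislikes / sampled_likes ))   # -> 1000 estimated dislikes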
I agree that removing the dislike count was dumb, but why would it have anything to do with quality? It's "dislike" as in "not like", not "low quality". I hit that button if I don't like the video for any reason, most frequently because it's boring, subject doesn't interest me, I don't like the person or they say something I don't like. I don't hit it if the video is merely low quality.
I haven’t ever used the dislike button. It never functioned as a “don’t give me more of this”. I did use the “not interested” option to do so, but as others have said “dislike” is ambiguous, not to mention the opportunity for abuse.
How is this a useful response to the perspective shared in the GP?
Even if you don’t use it, it’s still a useful signal. Are you saying this isn’t the case? For example, did you find the dislike ratio to be a misleading signal, when it was around?
I shared to express that it wasn’t a signal universally understood. I never saw the dislike button as anything other than people hating on a video. There’s no context to it. You don’t know if the dislikes are people who don’t like the content or the author or the subject or the channel. In other words, for me it was never a useful signal. It only reflected how terrible the internet can make people behave.
I understand if my perspective is a minority opinion, but I think it’s a valid one.
I found the like/dislike ratio to be very useful, personally. Since it was removed and I have no signal as to whether or not the video is any good, the number of crap videos I end up playing has increased substantially. It's really annoying.
Yes, certain videos would get bombed -- but that, too, tends to be obvious and tells you something useful about the video: namely, that it's controversial in some way.
The reason dislike count had to be removed is that it enables hate campaigns against things (e.g. the recent controversy around the trailer for the live action The Little Mermaid movie, to name one among many).
Organized hate campaigns have a disproportionate representation in review sites, so they break rating systems. Also, the dislike count acts as a positive signal to hate-bombers ("we're winning", "let's make it to 50k dislikes") more than it helps anyone actually engaging with the content in a more healthy way.
One could also argue that the like count also creates a pernicious culture around subscriber / watch-count ("help us reach 1M subs!") that negatively affects young content producers and adds nothing of value to the world. But that's a much more complex conversation.
Perhaps this is just democracy at play, doing what it's meant to do.
I've found that people who interpret a high number of dislikes as "hate campaigns" are usually the ones who agree with, if not applaud, the content being disapproved of by a larger group that is frustrated with the subject matter and is using its voice to show its disapproval.
This is just as valid the other way around, where you should equally have the opportunity to express your disapproval of a video on the basis of political issues if you so please.
Review-bombing certainly exists, but I don't believe it's the norm when there's a high amount of dislikes out-ratioing the likes.
And even in the case of review-bombing: to me, it's a clear indicator of societal sentiment around a topic or aspect of whatever it is. I think we should be paying more attention to it, and spend more time publicly discussing these moments, treating high dislike counts as a valid point of view.
Unfortunately there is a certain political leaning who has grown quite comfortable with gaslighting those who hold and express majority opinions that aren't popular with late night TV hosts and celebrities, who further encourage hatred, violence, suppression, and ridicule towards this populace as they watch the wealthy and powerful deconstruct every bit of their culture and values, where the only replacement is woke garbage that they've clearly expressed not wanting through their votes, and yes: dislikes.
Democracy is eventually unfriendly to those who twist it into a guise for fully endorsed mob rule — and it will bite back.
This reminds me of how games (coughLeagueOfLegendscough) match me up with a player that I literally just reported for harassment, griefing etc. in the previous match. Like, there weren't any other possible players to match me with out of the many thousands? I don't mind waiting a few more seconds to match up with someone else!
I'm getting a lot of responses as if I expect the player to be totally blocked from matching with me, but that's not really what I mean. Even a basic "temporary match score reduction" would work. I would also expect a cooldown system where the reduction caused by reporting no longer applies. Arguing against the idea "because it creates a prisoner's island" misses the subtlety of a well-thought-out implementation where reporting simply lowers the matchup "rank" for a while, and serial-false-reporters have their reports basically nullified/ignored.
Overwatch had a feature where you could avoid players, but a top sniper was avoided by enough top players that he could never find a match. The matchmaking system could never find him a match because almost all potential matches had at least one player on each team that had him avoided. The devs eventually reworked that feature so that he could play the game. Sometimes seemingly innocuous features have weird edge cases!
Even for an average player, adding an avoid player feature would ultimately make matchmaking take longer. It necessarily reduces the number of valid matchings of players, so the system will only be slower. My guess is that average queue time is an important metric that management does not want to increase. It’s an especially tough sell because if you really wanted fresh players in your game, you could always not queue for a few minutes and you will almost certainly get all-new teammates.
Isn't it fair that if every player you play against decides they don't want to play with you, you shouldn't get to play? Obviously something is wrong with you. (A 4chan style let's-make-this-person's-life-hell attack notwithstanding)
It wasn't that this player was disrespectful/racist/etc., it's that they were so much better that playing against this player was almost a guaranteed loss.
Maybe the system was different back then, but now Avoid will only avoid the player as a teammate, they can still be on the enemy team. So if someone is too good, there's no incentive to avoid them.
Maybe that means he won the game and it's time to do something else until someone with similar skill comes along? Or maybe there should be a handicapping system.
There were players of similar (and higher) skill — but they didn’t want to play with him because he was “annoying” to play against.
Which is a normal part of a competitive multiplayer game, except that people got the option to avoid him (rather than learn to beat him), so many took it!
If they were playing the game to have fun, it's an entirely rational decision to make, even if it's unfair to the good player. In fact if the developers were optimising the game for fun they should have kept the feature as it was. In my opinion it should be more important to optimise games for fun than for fairness, but I suppose the developers don't share that opinion if they changed it.
Optimizing for fun is kind of impossible, at least in the sense of making a fun but unfair game. Fun games are inherently ones where you're matched with players at your skill level, and you're actually being challenged while still getting kills. Going 30-0 is only fun for so long (and for the people who don't get tired of it, they often go and download a cheating program).
I mean, these are competitive games after all. Racing against Max Verstappen on iRacing is not an issue, it's what makes those games interesting. You don't ban Federer/Nadal out of tennis; you let people compete to raise the bar.
Your parent comment specifies that the reason so many people were avoiding this guy was that he was good at the game. I wouldn't agree that that should mean he's not allowed to play.
Right, it sounds like Overwatch's implementation was poor. I don't expect to have the player "unmatched" forever, or even completely unmatched for that matter. Just lowered in the "algorithm", probably including a cooldown factor so the "de-rank via reporting" is gone after some time.
Like I said I don't care if matchmaking takes longer (usually it's only ~30 sec). The pool of players is utterly massive, and ranking one player down a bit would make a trivial difference.
Yeah, I could wait a couple minutes and not queue up. But that's making it my problem that I had to deal with a toxic/abusive player. I already did my part and reported them, but now I also have to just not play the game for a while to make sure I don't get matched again? Indeed, my suggested solution of lowering their matchmaking "matchability" to me would benefit the community. Not to mention, the effect could be nullified if I am spamming unfounded reports against players constantly.
I kind of just assume that a competitive multiplayer game under active development for the past 15 years would have functionality like I've mentioned above. Maybe it does, and it just hasn't worked well to prevent me from matching up with toxic players multiple games in a row.
Overwatch's current avoid player system has a limit of 3 players, and only for 7 days per player - and no way to extend unless you run into that player again. Also, it's only avoid as teammate (which was alluded to in the top sniper story, but not specifically stated).
Splatoon 2 had the same issue, where people would block the best players to get more favorable matchings. It was reported to still mostly work that way for lower ranks, but they completely removed the effect for the highest tier (X rank).
Inter-user features are always delicate to balance when it comes to competitive games.
This has been discussed by the community for as long as I've been playing (2012ish), and rito has said they won't do it because it creates a "prisoner's island" type situation where all the mean players end up playing together and player retention drops. They argue it's much better to attempt to rehabilitate rude players, but this was back when riot Lyte was doing his thing. I don't know if they've spoken on the subject since.
That's wild. Their reasoning (or even finding, they should have the data) is that if you isolate the annoying players away from the general population, people overall play less? I'd intuitively expect the opposite, you play less if you constantly have rude players around.
Or is it more of a thing of "60% of our players like to harass people, if we only pair them with other harassers, they don't have any fun and leave"?
My memory isn't 100% but they essentially said that most players are actually unlikely to be toxic with any regularity. Typically players will have a bad game, and they will lash out. They said that for players that are habitually toxic, a short ban (I think 2 weeks is standard) significantly reduces the chance of reoffending. For players with repeat bans for harassment or aggression they get permanently banned or eventually quit because of frequent longer bans.
Most of their communication should still be online in some form whether on their website or reddit. All this is just from my memory so subject to misremembering.
> They argue it's much better to attempt to rehabilitate rude players
That's fascinating. It was rude and nasty players that drove me away from online gaming completely. They make it seriously unfun, and if it's not fun, there's no point in doing it.
I've noticed this happen in other online games. So these days when there's someone I don't want to play with again I will report them but also wait a few minutes before queuing for the next match to give that player time to get matched with others.
YouTube doesn't seem to understand my preferences, at all. It's far, far too biased toward what you recently watched. It basically doesn't work and certainly doesn't "learn" anything.
I watch it daily, I have never listened to any hip hop, I don't watch sports nor trending topics, yet every single day it suggests football videos, MMA, the latest hiphop hit, and celebrity crap (imagine the Kardashians) among my other interests.
I click "don't recommend this channel", or "I don't like this video " every single day, I have an extension to make this convenient so I train it dozens of times a day yet it doesn't bloody care.
I pay for Premium because I watch it on my iPad and smart TV, but I'm planning on telling Apple to fuck off, buy Android everything and sideload a system wide adblocker so I can stop paying for that terrible service.
This is the inconvenience it forces upon a paying customer. What an abusive relationship.
> I watch it daily, I have never listened to any hip hop, I don't watch sports nor trending topics, yet every single day it suggests football videos, MMA, the latest hiphop hit, and celebrity crap (imagine the Kardashians) among my other interests.
I don't have this experience at all. YouTube recommends me pretty much exactly what I've been recently watching. I watch a lot of long form interviews and tech videos, and my recommended feed is entirely filled with more of the same, which are of decent enough quality for me.
I used to have the same experience of YouTube recommending just more of whatever I just watched, and at most recommendations would stick around for 6 weeks before it forgot about a subject. That was until earlier this year when I watched some videos from MrBallen (camp fire stories based on real events), and ever since then I have gotten my recommendations filled with crazy conspiracy stuff.
It stopped recommending his videos months ago, but lasting damage has been done to my profile or something like that because the conspiracy stuff sticks around and has crowded everything else out. The old behaviour of fixating on something I just watched for 6 weeks no longer happens.
I suspect that certain types of videos/subjects are able to leave a massively oversized weighting for a category on your profile that everything else gets completely dwarfed by.
Exactly the same experience here. I kinda like MrBallen's presentation and I know I can't take his stories at face value, but apparently it's so close to conspiracy stuff that it just overwhelmed my recommendations. I guess part of the problem is that conspiracy content has very high engagement among its audience, so youtube is biased towards it.
Ultimately I had to go back and scrub his videos from my history and even then it still took a lot of effort saying "not interested" to conspiracy stuff until I no longer got those suggestions.
If you want to fix it (and I agree it shouldn't be this painful, but you deal with what you got), you can go back in your history and remove the videos from MrBallen from your watch history.
Not parent but same experience. I watched MrBallen's videos as "creepypasta based on real events", and I guess that is a pretty similar vibe to conspiracy theories. Never watched any of the conspiracy content and still got it in my recommended videos for a long time.
Nope, never watched any of the conspiracy videos.
I did however watch quite a few of MrBallen's videos, specifically all the cave/diving ones; they are just great stories.
> YouTube recommends me pretty much exactly what I've been recently watching.
It depends. Maybe I wasn't clear, most of the recommendations are fine (not great, but still pertinent), but sections like "New to you", or "Just uploaded" or even the search suggestions are full of trending videos that I keep telling it I don't care about.
Right now I just get recommended three categories of videos:
- 90% relevant to my interests but often stuff I've already watched. Nothing that strays a little further than the channels I already know. Keeps recommending stuff I've watched and liked months or years ago. Video #42 of a series I've seen fully.
- 9.9% useless trending bullshit. This month it's all about the "Why have we divorced" or "Meet our new baby" videos. I don't have children, I'm not even in a relationship. Google decided I would care about this.
I actually expect those sections to completely ignore your preferences. They're just advertising sections, so they reflect what YouTube wants you to watch.
Youtube does that because people like me compulsively re-watch content. Youtube probably needs a few user-visible knobs like "I like to rewatch stuff vs I hate rewatching stuff"
For me it recommends exactly what I watch - I've never seen any recommendation for stuff I don't watch.
The trouble seems to be it can't seem to find anything new. So if anything, I'd want it to branch out a bit and find semi-related stuff, but it doesn't.
You're not the first person to complain about YouTube recommendations, and I don't know why my experience is so different than yours.
Thanks. I should really dedicate a weekend to setting something like that up on all my devices. I'm willing to put up with bad recommendations if I know they don't get a cent out of my views.
Do you perhaps have some account setting where it doesn’t store info or track you? In other words, are you preventing them from understanding you?
I have a completely different experience. My recommendations are pretty decent, if recent-biased. You’re not on the “explore” or “trending now” page right? Obviously trending videos aren’t tailored to individuals
It's insanely biased towards popularity. I watched this series of like 40 videos on extremely large numbers. And not one time did I ever see one in my recommendations. You would think when you've watched 1-20 of something the algorithm would be smart enough to suggest 21 but no.
What's funny is that the super smart AI behind the rec engine in YouTube sometimes is so dumb it can't figure out that video #3 in a series with consistent titles should be followed by video #4.
Sometimes it just gets it wrong and skips one, other times it just recommends some other totally unrelated video even if you've been watching the same numbered series forever.
And it's got a short enough memory span that one day it'll recommend out of the blue, for weeks, video #57 of a series you've seen a couple years ago.
I wonder how many of the "algorithm's" problems are caused by a huge number of literal toddlers that just hit buttons at random. How many parents just sit their kid at an iPad for some time to do something?
It does do that for me, and it also regularly recommends videos with <1K views if it's from a channel I often watch. It even sometimes recommends videos with <100 views from channels I don't know if it's on a topic I watch videos on regularly. So it seems the algorithm doesn't do the same thing for everyone.
Somewhere in the middle for me, but what it recommends to me are generally channels I am already interested in. I think the only novel recommendation I've seen recently, that I enjoyed, was of some kind of classical Chinese music. Otherwise, I see a lot of absolute fucking garbage from the likes of Mr. Beast.
I feel like every recommendation service works this way - YouTube, Amazon, Spotify, etc. Whatever the last thing you looked at becomes your entire life's mission according to these systems. When I think about all the buzz words around machine learning, ai, and so on, it's completely incompatible with the reality I see. What exactly are all these engineering teams doing just to end up with this result?
Amazon might be the worst example though. Buy something like a bike rack for your vehicle and Amazon will recommend nothing but other bike racks... as though I need dozens of them for one car.
Youtube does have a great escape hatch though: go to the History tab and delete your viewing history. Your recommendations start from scratch and you'll get funneled into a potentially new corner of videos.
With that in mind, I think Youtube recommendations are quite good. Just do it when your recommendations get stale, monthly or so.
Well, I'm one to appreciate what we have available to us especially compared to the alternatives and temper my appreciation accordingly.
I don't know if I use a single service that gives me particularly compelling control over recommendations. Though I also don't find myself deep-diving my history so often that clearing it every few months affects my life, but I suppose I can see how that could be a setback.
My main point is that if the recommendations are improved by giving YouTube _less_ information to work with, that's a solid sign that the recommendation system is borked.
It'd be great if I didn't use my History to actually find videos I've previously watched...
I don't understand why recommendation engines just don't provide a reset button. It'd be better for everyone involved: content provider, content creators, users, ad buyers.
It appears to any end user who knows some basic logic and looping concepts that the advanced AI/ML/whatever recommendation engine is no better than looping over the list of recent content (videos, products, etc.) and selecting more content that has the same keywords. If you remove the volume and distributed-systems elements for large systems, it's probably a single SQL query joining 2 tables, history and products (or videos or whatever).
For the bike rack example, to make it "smarter" you could have a product type that is something like "one-time purchase" and a linked product type for accessories to that one-time product, like helmets, pumps, tire-changing kits, inner tubes, water bottles, etc. This is what a retail worker at a bike shop would do: ask if you need any accessories, maybe point out clearance/sale items. That's it, no fancy engines/AI/ML needed. If the customer wants other stuff, there is a search bar they will likely use.
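To make that concrete, here is a rough, purely illustrative sketch of the naive approach described above; all table and column names are made up and this is not anyone's actual system:

    # Purely illustrative: the "dumb" recommender described above,
    # i.e. one query joining a history table against a catalog on shared keywords.
    import sqlite3

    def naive_recommendations(conn: sqlite3.Connection, user_id: int, limit: int = 10):
        # Find catalog items that share keywords with the user's history,
        # excluding things they already have, ranked by keyword overlap.
        query = """
            SELECT p.product_id, p.title, COUNT(*) AS keyword_overlap
            FROM history h
            JOIN product_keywords hk ON hk.product_id = h.product_id
            JOIN product_keywords pk ON pk.keyword = hk.keyword
            JOIN products p ON p.product_id = pk.product_id
            WHERE h.user_id = ?
              AND p.product_id NOT IN (SELECT product_id FROM history WHERE user_id = ?)
            GROUP BY p.product_id, p.title
            ORDER BY keyword_overlap DESC
            LIMIT ?
        """
        return conn.execute(query, (user_id, user_id, limit)).fetchall()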
I chalk it up to Resume Driven Development and just the innate tendency of engineers over-engineering things because that's how our brains work.
Most preferences type systems don't seem to get the nuances involved.
I just watched a pleasant video about a particular video game.
Youtube now wants me to watch the most obnoxious youtuber they can find who also covers that video game.
No, actually the reason I watched that other video was because the person in it was nice and pleasant.... the recommendation is the LAST thing I wanted.
YouTube doesn't care about your "preferences." It only cares about what it predicts your behavior will be. If you tell your drug dealer that you're trying to quit crack, he's still going to sell you more crack.
Well, at least it doesn't work like Instagram's, where it doesn't do anything at all. I don't want to see hundreds of videos of street vendors all doing the exact same thing.
Yeah, on a non-logged-in system I watch YT on, I watched exactly ONE gaming-related video, and now suddenly the "home" view is full of Minecraft, Roblox, whatever FPS shooters, etc... Like... I'm not suddenly absolutely obsessed with a topic because I viewed one video, guys...
It operates similarly to telling a spouse or in-laws or other family that you like some <collectable item> and they are the type that runs with it until you have to awkwardly tell them "No more". Every holiday, birthday, random "oh I saw this and thought of you" type of thing.
Honestly, I feel like the recommendations are the _only_ good thing YouTube has lol. I often see a ton of complaints about the recommendations but I've honestly never had an issue. I know that yes, you click on one video and now all of a sudden you get 100s of that type... but it sorta works for me since it seems like that "effect" wears off just as fast as it starts.
For example, I have a friend that's into cars. I watched a few car videos and then 50% of my recommendations were car videos. It works for me though, because usually those videos are genuinely interesting. I haven't watched one in weeks tho, so going to my home I see 0 car related videos.
I will 100% believe everyone though when they say it just doesn't work for them. Just wanted to share my experience.
Completely disagree. I think what is great about YouTube are the subset of creators that make genuinely good content, and especially those creators you follow consistently for a long time and really enjoy and get something out of.
The algos are useful if I follow so many creators that I can't keep up, or occasionally I get some good unexpected content. But my favorite creators were all recommended to me by other people, not algos.
Also, the algos just peddle in commoditized interactions like Likes and Watch Time and Subscribes. They can only optimize for a meaningful creator-observer relationship insofar as that's measured by Watch Time, Subscribes, etc.
Not to mention some obvious problems, like how clickbaity the recs are for new/anonymous users or on videos with few views. Or the whole Elsagate incident (which no doubt continues to be a problem).
Overall I still like YT but I don’t think the commoditized algos are helping much. Just let me get recs from friends or search for what I want. Don’t try to get me to doomwatch easy content all day… and especially don’t try to get kids to doomwatch easy content all day.
I can certainly agree there should be a separate algorithm for under-13-year-olds. I've seen a fair number of family or friends with toddlers, and you check out what they're watching and 9/10 times it's literal nonsense. 3D renders of Spiderman jumping on a bed or just running around in GTA, lol.
Very odd and you can certainly get some bad content in the mix because they don't know what they're doing.
Yup, works fantastically for me as well. I almost never go to my subscriptions because every new video I want to see is right there on the home page.
I super love the categories at the top too. If I want all cooking videos instead of just 2, it's a single click. It's really neat to see it use machine learning to tell me what my ~15 top topics are. And it's entirely correct.
Most people here don't understand that folks at YouTube have probably run thousands of variations of their algorithms, and their current system is likely the best yet for keeping the highest engagement on average. The issue might be that the majority of the population loves recency bias. Folks here are not representative of the general population, so they get overrun by majority preferences.
That's valid, but a bit depressing. It means that YouTube will never be able to accommodate me. Hmmm... maybe that's the realization I needed in order to finally give YouTube the boot. Thank you!
Not sure what we are doing differently but for me it's pretty much random and useless. It seems to be a mix of mostly old videos from channels I follow anyway and random junk I am not interested in. I am sure I could find something new and interesting in there if I looked long enough but the signal/noise ratio is very low. Old account with thousands of views and likes/dislikes.
Do you prune your history of garbage? Every time I clear all the click-bait, poor quality, and accidental picks out of my recent watch history, I find the quality of the recommendations goes up. Are you subscribed to some high-quality channels?
I don't prune the recommendations (since I don't use it) but my subscriptions are always carefully curated, about 50 channels I think. My subscription feed is all good, that's what I use as my landing point.
In all honesty, I could have a very low bar lol. I do believe it though. I've seen people post screenshots and they have a ton of nonsense recommended. I've also rarely ever clicked "don't show me this".
But just look at that same behavior over time. Say you searched for car videos yesterday, and today you are searching for basketball videos.
Old youtube used to be very simple and recommend similar videos to the current thing you are watching, so you would get basketball videos when you were looking for basketball videos.
Now, it will do that for some, but it will also put car videos in the recommendations as stuff you "might like". Not only that, it will probably recommend the same car video you watched yesterday, because you liked it yesterday so surely you'll love it today too, right?
This only gets worse as you add more "unique" topics to your searches and follows. It doesn't matter what video I'm looking at, I get the same recommendations for all of them.
There is some incentive to want you to watch car videos in that situation, as someone with a recent automotive interest is an ideal person to show car ads to.
True. I do have premium though (hopefully that doesn't make a difference) + adblocker so I never get traditional ads.
Another example though... I just watched a few videos about fake self help gurus and influencers. I now have 3/10 recommended videos about that. 2/10 a random meme (which do look funny, tbf) and then the rest are subs.
About 6 months ago I ran my own little unscientific experiment with Youtube, after I noticed them promoting a certain channel that I've never watched. Nothing political or anything like that. Just a run of the mill youtuber, currently very popular with the kids.
The first week or two I was pushing the "not interested" and "don't recommend channel" every time one of the videos would show up in my feed (mind you I've never actually clicked on one and watched it and I'm a firefox+ubo user). Those two buttons had no effect at all. So I tried flagging the videos for "unwanted content". No effect either. The videos would keep on appearing in my suggestions at various places for a whole month straight.
The theory being that the channel was being actively promoted by Youtube and was thus impossible to hide. Ties in with the removal of the 'dislike' button as I believe it had the same reasoning behind it: raising the value of their advertising service.
Yeah, flagging stuff or just clicking dislike sometimes does not work at all. I feel this is because popularity matters more than those dislike actions. Additionally, YouTube's algorithm treats dislikes as engagement, so it may work against your common sense.
What I did find very useful some time ago was to actually block whole YouTube channels (you do it from the channel's info page by clicking the flag button). This is the ultimate way of getting rid of some content in your feed. I blocked pretty much all these 5-Minute Crafts, Blossoms, Bright Side and other crap-generating trolls. Best decision ever!
Why is everyone pretending that the maximal amount of ad revenue doesn't come from showing you what you want to see?
Individual videos don’t pay to be in the recommended feed. There’s no opportunity to see an ad except when you watch a video. YouTube has a perfectly balanced incentive to show you more content you’ll watch.
Where is the exclusive choice between (what the user wants to watch) and (what makes more money)? The only thing I can imagine is that they would down-rank videos not cached in your region.
> Why is everyone pretending that the maximal amount of ad revenue doesn't come from showing you what you want to see?
There's a very philosophical debate hidden between the words in that sentence.
But since Youtube sees dislike as a signal to boost a video, it spreads content-views to things that people don't like, but feel compelled to interact with in (what I assume is) the "Not now, I'm busy - someone is being wrong on the internet" kind of way.
You can say, that is what people _actually_ want, since that is what they end up spending time on. You can point to popular subreddits like r/idiotsincars or r/makesmybloodboil and say "see! people love being enraged", and I think it is a valid view on defining "want", though not one I subscribe to.
I think this is not what people want, and promoting this kind of behaviour will in the mid to long term scare off users. My understanding of Tik-Tok is that they promote a style of feel-good content, rather than rage-inducing content and they are beating both Facebook and Youtube, who've been promoting dislikes.
> But since Youtube sees dislike as a signal to boost a video, it spreads content-views to things that people don't like, but feel compelled to interact with in (what I assume is) the "Not now, I'm busy - someone is being wrong on the internet" kind of way.
I just don't see YT having the same feedback loops as other places. You can stay on without directly typing in comments like on FB or Twitter.
I personally never see rage-inducing content on YouTube (nor on other sites), nor do I see it on my partner's, friends', or parents' accounts. I'm sure it's there, but I think the people who get it are training their recommendations to show it, not being forced into it. Ironically, I get way more rageful content on TikTok, which has a much weaker "dislike" signal and a stronger propensity to show you new things.
To the point, I understand the incentive somewhere like FB has to show content people don't like but still engage with, but I'm not sure that incentive exists on YouTube. Furthermore, I think we fail to credit companies (which are made up of sentient and capable people) with being able to recognize short-term vs. long-term value and the importance of reputation.
I reject the notion that there’s someone in San Bruno pitching that they use the dislike button to show more of that video instead of a negative quality signal. No one I know in Silicon Valley who works in tech would make such a suggestion, and they’re no different than the YouTube team on a fundamental level.
Technically correct, although there is a subtle nuance here.
What you want to see is not necessarily the same as what you will end up watching. Just because you dislike a video doesn't mean you wouldn't watch it.
Or put another way, you can tell YouTube what you want (to be perceived socially as wanting) to see, but the algorithm knows what you actually want to see.
> Or put another way, you can tell YouTube what you want (to be perceived socially as wanting) to see, but the algorithm knows what you actually want to see.
Except the algorithm doesn't know what I actually want to see. Nobody but YouTube knows what I tell YouTube I like or dislike, so there's not even a theoretical pressure to lie about my wants in order to present an appearance to others.
By definition, clicking a dislike button means you watched at least part of the video. It's a signal that "I started watching this and it wasn't a good match".
That should send the message that it's appealing from the title/thumbnail but a bad video. I don't think the "socially be seen as wanting" angle plays too much here, because you already started watching; you already sent a different signal.
So do I. As such, what we watch doesn't make much difference to their bottom line, so they're certainly not going to write a second, just-as-expensive-to-maintain recommender engine just for the three of us.
You are in a minority. Why do you think they built their recommendation system with you in mind at all? You've already paid, you no longer impact the fit of the short term revenue maximization objective function.
I suspect daily/weekly/monthly active users and minutes viewed are a metric they're interested in maximizing. That would be incentive to optimize the recommendation system for paid users too.
I think that, concerning paid users, what they would be interested in would just be daily/weekly/monthly paying users. As for minutes watched they'd be incentivized to reduce it as far as it is possible while still retaining a paying user. After all, they don't show more ads if a paying user watches more minutes of video. At least not yet.
Agree. This study is assuming the intent of the feature is to remove those types of videos. In reality the intent is to use it as one of many signals to make better ML recommendations. Oftentimes, there are also differences in revealed and stated preferences.
The only way to make YouTube even remotely tolerable is with liberal use of the "Not Interested" and "Don't Recommend Channel" functions.
I also set up separate YouTube viewing accounts by interest, so that I can clobber away everything not related to a specific interest in that particular account (eg. anything not sports related gets "Don't Recommend Channel"'d in my Sports YT account).
YouTube pretty much fights you tooth and nail the whole way, but eventually your blocking of irrelevant and garbage recommendations leaves them no choice but recommending things related to that interest.
The only way to make it tolerable IMO is to stop using its frontends, and relying on its recommendation system. Use alternative apps like NewPipe, Invidious, and mpv+yt-dlp, and curate your own viewing experience with a simple list of uploaded videos from your subscriptions. If you want to watch a video about a certain topic, search for it, and never watch it from YouTube itself. Just don't feed their algorithms, and don't rely on them.
I just use this Chrome extension[1] (though actually with Brave for the ad-blocking) and don't worry about recs at all. When I want to watch a YouTube video I go on YouTube and am given a list of recent uploads by my subscribed channels. I pick one, watch it, then usually close the tab because there are no recs on the screen to lead me on an endless rabbit hole.
I'm almost at this point as well. Using the Not Interested and other tips helps but it's not long before the stuff rolls in not remotely matching my interests.
I run https://fiction.live which has a similar recommendations function as youtube.
The key insight here is that dislikes are not the same thing as uninterested. People will dislike stories that they're interested in but think are poorly written or do not have the ending that they want it to have. If you use dislike to mean uninterested you are ruining the recommendations.
Rather you need to have a separate button that says do not recommend other content like this. That's what we do and it works out well.
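That distinction is easy to sketch. The snippet below is not fiction.live's code, just a minimal illustration under the assumption of a simple score-based recommender: a dislike penalizes the individual item while a separate "don't recommend content like this" signal excludes the whole topic.

    # Sketch only: treat "dislike" and "not interested" as different signals.
    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Item:
        item_id: str
        topic: str
        base_quality: float

    class Profile:
        def __init__(self):
            self.topic_affinity = defaultdict(float)  # learned per-topic boost
            self.excluded_topics = set()              # hard "don't show me this" list
            self.penalized_items = set()              # individual items the user disliked

        def record_watch(self, item: Item):
            # Watching something is the main positive topic signal.
            self.topic_affinity[item.topic] += 1.0

        def record_dislike(self, item: Item):
            # A dislike is a quality judgment: penalize the item, keep the topic.
            self.penalized_items.add(item.item_id)

        def record_not_interested(self, item: Item):
            # "Don't recommend content like this": drop the whole topic.
            self.excluded_topics.add(item.topic)
            self.topic_affinity.pop(item.topic, None)

        def score(self, item: Item) -> float:
            if item.topic in self.excluded_topics or item.item_id in self.penalized_items:
                return float("-inf")
            return self.topic_affinity[item.topic] + item.base_quality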
The zeitgeist of our culture; of course you didn’t MEAN to be negative, here try again.
No, no; I most certainly did mean to suggest you should stop showing me this persons channel. Not because I hate the person but because I don’t want a shit “AI” that’s feeding me what Google makes the best margins on.
> feeding me what Google makes the best margins on.
Lol exactly. Google and I disagree on a fundamental level; they cannot be the ones doing the work to find what I like, which is why I've disappeared the entire right-hand column on YT altogether with an extension.
What I find interesting is the duality of YouTube criticism. On the one hand, YouTube should respect people's like and dislikes to a letter, but also at the same time YouTube pushes people into dark echo chamber rabbit holes by only giving them what they want.
On the one hand they need to stop accidentally taking down innocent videos, but they also need to take down bad videos faster.
People don't realize that all these issues are two sides of the same coin. They complain on each side separately and don't understand why optimizing for one messes up the other.
YouTube likes and dislikes are now pretty similar to Facebook likes.
It's not there to help you, the user. It's there to make the content creator feel popular. That's why YouTube removed dislike numbers and Facebook never had it to begin with. It's mostly window dressing.
A placebo button is a push-button or other control that appears to have functionality but has no physical effect when pressed. Such buttons can appear to work by lighting up or otherwise reacting, which rewards the user by giving them an illusion of control. They are commonly placed in situations where it would once have been useful to have such a button but the system now operates automatically, such as a manual thermostat in a temperature-regulated office. Were the control removed entirely, some users would feel frustrated at the awareness that they were not in control.
Examples include:
* Office thermostats
* Walk buttons at pedestrian crossings
* London Underground train door buttons
* Elevator door close buttons
* FCC do not call lists
I wasn't making a reference to American election fraud, but rather the notion that even in nominally democratic countries you are often voting for different sides of the same political structure with little fundamental change
The linked report [0] is dated today and uses data from the RegretsReporter extension [1] and additional surveys from December 2021, which is basically when youtube removed the dislike counter [2].
It's both surprising and not that surprising to see quantitatively how little disliking, or telling YouTube not to recommend channels or content, impacts the algorithm. They also rightfully point out how much of a pain it is to try to provide negative feedback to YouTube's recommendation system. It's always been a chore to remove the hot new trending clickbait from the recommendations.
I sadly doubt that this report will change much, but it's good to finally have some statistical validation of the frustrating experience I've had subjectively.
I'm not surprised by this. I gave up on the YouTube Home page, etc. years ago and I only look at the Subscriptions page. I've just found new channels to subscribe to through word of mouth over the years and I have more than enough content to choose from.
Same, but deep down I know that the chronological subscriptions page will be taken away soon, and that’s when I’ll stop using youtube except for direct links to videos.
My recommendations have been filled with crazy conspiracy stuff for months just because I watched some caving/diving disaster videos styled like camp fire stories. I have tried marking videos as not interesting and blocking channels, but there appears to be an endless supply of that stuff.
To make it worse it looks like accidentally letting the preview play a few seconds of a video marks the subject as something you are interested in and it can sometimes be near impossible to select a video in the middle of the page without having one of the surrounding videos start a preview.
I want good youtubers to be able to make a living and I pay for the YouTube subscription, but this crap is driving me towards setting up Invidious rapidly.
I've found that the algorithm can be extremely volatile for some things and barely react to others.
I watch half of one video by a 'pop' youtuber like MrBeast or LazarBeam and that's the only recommendations I get for a week.
Likewise, I can spend hours and hours listening to more passive content like music or long-form documentary style videos and I barely see any bump in that kind of content recommended for me.
The most effective tweaking methods to the algorithm I've found is just subscribing to people who cover a genre I want to see more of and then watching videos from their videos recommended section.
I'm perpetually confused by YouTube's weights for choosing recommended videos.
Ann Reardon (not Reed, as I wrote originally), from "How to cook that" did a video on it.
Basically Youtube rewards interactions, and a dislike is an interaction. Her main evidence was a video by pewdiepie where he asks everyone who views it to dislike it. It is one of his most viewed videos, at least around that time, and by far the most disliked.
Sadly, Ann Reardon names her videos after what she cooks or debunks, and not after her critiques of the algorithm (she has quite a few really good ones).
I attended a conference a few years back where a lead ML/data scientist at a big company you've heard of and probably use said the dislike button was ignored in favor of one question: did you watch and engage with the content?
This lines up with my experience and continued frustration with YouTube. I use the service a lot whether it be for background noise, research, and so on. The recommendations have been awful (doubly so in Shorts) regardless of how many videos I dislike or categories of channels I choose to stop recommendations of.
My only question then is: how do I know if I dislike a video if I haven't watched it? Sometimes I can leave a video early, but others I will have to watch for a bit before realizing it is a waste of time. This is more pronounced with Shorts due to the duration of videos. I will end up watching something that is 20 seconds long to see what it was about (you never know when you will find a gem), but then dislike it once it is finished. YouTube just keeps recommending low quality garbage until I get fed up with it and spam "Do not recommend channel" the moment a video shows up.
I noticed that when YouTube implemented the auto-play feature on the homepage, whereby you hover the mouse over a video and it expands slightly and starts to autoplay, it actually puts those videos into your watch history even if you only watched a few (2-3) seconds. After this I noticed that my recommendations got crappier almost instantly. Basically, I would see something with a 'nice' thumbnail and hover over it. Since then, every time I open YouTube, I need to go into the preferences and disable that shit, so as not to pollute my history and thus pollute my recommendations. I clear cookies when the browser closes and this setting doesn't get saved; for some reason Google isn't smart enough to save these settings/preferences to my account like all the other settings. Oh no, it's controlled by a cookie.
This may have already been scaled back a bit since I first noticed it, so maybe it's not 100% accurate anymore, but it still pisses me off.
I don't see the BlockTube extension [0] mentioned anywhere in the comments yet. It automatically blocks videos/channels based on your defined keywords and it works perfectly. For some period of time my main YT page was filled with variations of videos about "walking in Tokyo". I'm not interested in walking in cities, nor in Japan, so I have no idea where that came from. No matter how many of these videos I marked as "not interested", more of them kept coming for days. BlockTube just automatically purged them for me. Sometimes I saw just a blank page (because all the videos were blocked automatically), so I had to reload the page to see at least something. It took some time to get rid of the "noise", but my YT experience is now way better.
They left out the block channel button. Very effective.
Seriously, as an avid YouTube consumer (and not a creator) training my algorithm is such a huge part of what my entertainment life is. Definitely have used private mode, on a VPN, with a different browser to watch videos I didn’t want polluting my algorithm. Now that my kid knows how to use YouTube and I let him use my account, my years of work are totally shot… maybe.
It’s nice to be able to see and block channels, guide the algorithm away from YTP, and be sure to keep plenty of math and science options on the recommendations.
I’ve also noticed that it varies by time of day.
I wouldn’t mind more control, but as it is, I learn the system and do have control that I need. I don’t always know what I want, and do like surprises. Like, who knew I’d be into cow hoof trimming? YouTube did.
I'd be curious to know how many additional "wanted" recommendations the system is able to serve by not treating a dislike as a strong negative signal. It seems like there's a tradeoff here, where YouTube figures that if it interpreted a dislike with a heavy hand, it would also knock a bunch of videos you do want (or at least they think you want).
Presumably from YouTube's perspective, it's more important that your recommendations pane contains at least one video that you want to watch than that it contains zero videos you don't want to see. The user's desires are probably a little more balanced: seeing videos on topics you dislike is annoying even if you don't watch them, and even if there is a single good video mixed in.
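A toy expected-watch-time calculation (all numbers invented) shows why that asymmetry falls out of the platform's objective: under a watch-time goal, accidentally filtering out one wanted video costs far more than leaving several unwanted ones in place, while the user's annoyance at the junk never enters the equation.

    # Toy numbers, purely illustrative: why a watch-time objective tolerates junk.
    slots = 12                # videos shown in the recommendations pane
    p_click_wanted = 0.6      # chance the user clicks a video they actually want
    p_click_unwanted = 0.02   # chance they click junk anyway
    minutes_per_click = 8

    def expected_watch_minutes(n_wanted: int) -> float:
        n_unwanted = slots - n_wanted
        return minutes_per_click * (n_wanted * p_click_wanted + n_unwanted * p_click_unwanted)

    # Heavy-handed dislike filtering that accidentally removes one wanted video
    # costs more (under this objective) than leaving several junk slots in place:
    print(expected_watch_minutes(3))  # 3 wanted, 9 junk  -> 15.84 expected minutes
    print(expected_watch_minutes(2))  # 2 wanted, 10 junk -> 11.2 expected minutes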
Except that when the recommendations are 90% garbage, I'm not going to sift through them to find the occasional gem. That's just burning time for little reward.
Facebook, YouTube and many others are user hostile. They make money by controlling the information you see. Your attention is a commodity they sell. Both of these services... give you what you want from time to time just to keep you on the service.. then they maximize trying to divert your attention to what they want (or are paid to show you).
From one angle it's quite dystopian to watch people endlessly scroll or just go from video to video on YouTube... essentially an algorithm is influencing your behavior, perhaps programming you to an extent. Super creepy to see people everywhere just sit and stare at the device while this happens. It's like there is an illusion of user control.
There's still a dislike button, it just doesn't show the like to dislike ratio, so it's barely worth pressing. The outcry at the time seemed to indicate that this is explicitly to appease bigger media companies, who put out crap and don't care what anyone thinks (but do care about what people think other people think).
What pisses me off, though, is that if I get a video recommended, I can ignore it for days, sometimes over a week, and it stays in the recommendations unless I "Not Interested" it.
After I've chosen...lots of other videos, maybe it's time to stop recommending that one. Often it's a video I've already seen, and enjoyed. I hate doing "not interested" to those.
I also hate doing the work of it all. I would blame the algo or whatever, but I actually think the algo is working exactly how Google wants it to.
Long ago, tech giant CEO's realised that recommendation systems are very powerful. They can trick hundreds of millions of users into spending double digit percentages of their life watching memes.
From that point on, focus was reduced on working on recommenders. Instead the goal was not to trick users into spending their whole life on the platform, but to maximize ad revenue from those users in the limited time they did spend on the platform.
The one party that wasn't part of this was tiktok. And that's why they have stolen so much user share.
I find both TikTok's and YouTube's algorithms show me highly relevant and interesting content. The idea that they only show you memes seems to come from people who a) don't actually use these services more deeply than surface-level interactions or b) are actually watching low-quality memes (meaning finishing more than 75% of the videos), so YouTube/TikTok is simply showing them what they previously indicated they are interested in.
When I bought Youtube premium I started using it much more seriously (meaning curating subscriptions, marking things as "Not Interested", and actually following up on my notification feed when new videos get released) and I've been amazed at the level of quality it delivers. It learns exactly what you like and dislike.
And no, it doesn't force-feed me politics or ragebait like Reddit and Twitter, both of which do in spite of my efforts to curate them.
TikTok took a bit of time investment at first but now it shows me HN-style tech stuff, highly local restaurant recommendations, nerdy film stuff, etc. I never see dancing teenagers and I don't think I ever saw them after the first day I used TikTok.
Doesn't seem popular because it's autonomous and scary and all but youtube should just move towards the tiktok system of algorithmic recommendation purely based on what people watch. How people vote with their feet will always be the strongest genuine indicator of what they actually want.
Likes and dislikes have countless disadvantages. They're subject to review bombing, they heavily skew towards power users because normal people don't even care to hit any buttons, and they're a purely binary choice.
I've been thinking about this problem with Facebook's recommendations: while previous "dislikes" are taken into account when predicting whether a user will interact with a post, the system is still fundamentally trying to predict click-through rate, not whether the user enjoys a post. It might even be that disliking something is positively correlated with clicking it, since they may have watched the video before deciding it is no good.
I'm a bit surprised at everyone in the comments going "YouTube isn't great with their suggestions/thumbs down prediction etc"
Why do you expect Google (or any trillion dollar company) to care how good or bad the thumbs down or similar actions work? Google only cares about revenue and profits, for which it doesn't matter if the video is an educational one, a game being streamed or content designed to outrage people or spread hate. Did you watch the ad? Good. Google is happy. And google doesn't care.
"But I don't watch those clickbaity videos! Google's losing revenue on me!" Yeah well apparently not on 95 or 99% of the viewers apparently for them to continue acting like this :shrug:
Unless the incentives of Google and the viewers align (or regulation/competition forces Google), don't expect the thumbs down button to be more than a "Please Sir, I want some less" button-equivalent.
Every other day YouTube seems to want to give me another American local news channel, despite the fact I'm in Australia. I've marked them as "don't recommend" as I've seen them, but I'm still getting a lot of results for Fox Local Clapistan 7 all the time which just aren't relevant to me at all.
Especially with their new vertical and short format, I feel like downvoting does nothing. I get hooked in by something that looks interesting, and then get served reaction videos that I hate, people stealing content from others and assembling it into "5 hacks you didn't know", beginner mountain bikers going slow on a flat run, and for whatever reason garage workers complaining that people tell them they don't know their jobs but yeah, they know their jobs. I'm scared to watch these things because they last 10s each, so by the time I realize what it is and downvote/switch I'm already halfway through, and I assume YouTube considers this a success and reinforces my stream with that garbage. The point is that I still get the same content despite downvoting.
Personally, I found the homepage recommendations too sticky after pruning. I have other things to do than watch videos, but find it difficult to stop.
I've cut back to discovering content only via other sources (HN, other communities, friends) by blocking the homepage and sidebar recommendations entirely.
Plug this into your uBlock "My Filters" and YT should become far less addictive. It blocks both direct homepage access and SPA routing (on the latter the homepage will just be blank):
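(The commenter's actual filter wasn't preserved in the thread. As an illustration only, and an assumption about what was meant, a uBlock Origin cosmetic filter along these lines hides YouTube's home feed whether you land on the homepage directly or navigate to it within the single-page app:)

    ! Illustrative stand-in, not the original poster's rule:
    ! hide the home feed element so the homepage renders blank.
    www.youtube.com##ytd-browse[page-subtype="home"]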
My main problem with YouTube recommendations is that the majority of my feed is videos I've already watched. I generally get quickly tired of waiting for the algorithm to recommend something interesting.
Although occasionally on the mobile site it injects a "want to see something new?" section into the feed, which is a godsend. That list of recommendations is beautiful and interesting. However, there doesn't seem to be any repeatable way to access it.
Although the poor recommendations may be partly because I manage my subscriptions via RSS so most of my watching is direct-to-video links then leaving the site. But that doesn't seem to excuse recommending things that I have already watched.
I hope we will get a good alternative to YouTube. The amount of negative feedback from its audience has been increasing at a very fast pace, and Google doesn't seem to care at all.
At this point, I'm thinking they are very deliberately choosing to take this path. For what reason? I don't know. But it seems like Google really doesn't want people to discover any good content and would rather force users to watch a very select collection of videos.
My circle and friends are basically all watching similar content to what I'm watching, and I have almost no common interests with some of them..
You're supposed to open the three-dot menu on a recommended video and click either "not interested" or "do not recommend channel"... for some fucking idiotic reason.
Another fact about downvotes is that they count as engagement, so downvoting a video promotes it more than not clicking anything at all. That is why many youtubers go "if you don't like it, click the downvote and write about it in the comments", because both of those actions drive "engagement", so the video is promoted more.
What a lot of people are missing is that when watching YouTube, they're the product, not the consumer.
I know this has been said many times before, but ignoring this reality is like trying to understand life without considering evolution, or understanding chemistry while ignoring the periodic table.
Google's incentives are to tailor customers to advertisers, not customers to content!
They don't care if you see what you want to see, they care if you see what they want you to see.
How does that at all relate to YouTube videos though?
YouTube wants you to see ads. Ads are put in front of YouTube videos. Therefore youtube wants you to watch videos. YouTube, logically, should want you to watch more videos, for longer. If they serve you bad videos, you stop.
Therefore, Google's incentives are to tailor good recommendations of content for customers.
QED.
I think all this “you’re not the customer you’re the product” signaling - which is true to an extent- needs an additional corollary of “products aren’t free to acquire or produce” or “businesses can’t exhaust their supply of products to sell”.
Because if YouTube truly doesn't care about serving you relevant videos, as you conjecture, why not recommend the same 100 videos to everyone? It'd make caching way cheaper.
Actually this is false. It’s a bit counter-intuitive, I know, but it has long been known in the field of psychology that a variable reward system works significantly better than consistent rewards or consistent punishment. The reality is that a mixture of bad, mediocre, and occasionally good recommendations is the most likely to keep you on the site for the longest.
They are sort of recommending 100 videos to everyone.
In my circle, a lot of people I know have very different interests than me. Yet, whenever I recommend a video to a friend, they often say it was already in their recommendations.
They might be pushing videos or topics that advertisers are more willing to pay for.
> a lot of people I know have very different interests than me. Yet, whenever I recommend a video to a friend, they often say it was already in their recommendations.
Is it possible that Youtube knows your friend's video interests just like you do?
> They might be pushing videos or topics that advertisers are more willing to pay for.
How does this fit if I pay for YouTube Premium? I don’t see the ads, and I guess they still get the data. Why are they trying to drive me to particular content if they can’t show me the ads they want to?
That is why for the last couple of years I have been only using Invidious/Piped (on desktop) and NewPipe (on Android).
I have my subscriptions there and only new videos from the channels I'm interested in appear. No distractions, no trending low-quality videos in my feed anymore. I only get what I'm interested in.
When I want to look for something new, I head over the popular/trending sections and maybe navigate a bit on vanilla YouTube to look for new things.
Judging by the ads I have been getting lately, Youtube must somehow think I'm a Japanese person who moved to middle America, because I'm getting ads for local businesses I couldn't use if I wasn't living there, like pet grooming centers. Unless of course, there's some malcompetence at play, and youtube is deliberately allowing for the waste of advertising budgets of middle American businesses on people like me, to sabotage them.
A thing that I noticed is that YouTube recommends completely different videos when I'm at work (mostly music) than when I'm at my desktop at home (a complete mix of interests). And different videos again when I'm on my phone (documentaries).
It seems the system is coupled with your device and time of day.
Funny thing is, I never get recommended music on my desktop and at work I never get documentaries recommended.
That was my experience years ago when I last did so. Logging in offered no visible upsides and multiple downsides.
I learned not to log in and use search wherever possible.
Now I use Invidious (with JS disabled and/or uMatrix blocking YouTube functionality in the browser as an additional nudge), or youtube-dl / ytdl / mpv to play video directly. Usually audio-only for talkpieces.
I'll also disable recommendations and comments portions of the website, for additional bonus (via Stylus).
There's another script that deletes watched videos from the specific YouTube Plex library every night.
It works great as long as you remember to remove the videos from your Watch Later playlist before watching them on Plex. (I have yet to find a simple way to automate playlist removal.)
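For what it's worth, a nightly cleanup like that could be sketched with the python-plexapi library; the server URL, token, and the "YouTube" library name below are placeholders, not the commenter's actual setup:

    # Sketch only: delete already-watched videos from a Plex library named "YouTube".
    from plexapi.server import PlexServer

    PLEX_URL = "http://localhost:32400"   # placeholder
    PLEX_TOKEN = "your-plex-token"        # placeholder

    def prune_watched(library_name: str = "YouTube") -> None:
        plex = PlexServer(PLEX_URL, PLEX_TOKEN)
        section = plex.library.section(library_name)
        for video in section.all():
            # viewCount > 0 means the item has been played at least once
            if video.viewCount and video.viewCount > 0:
                print(f"Deleting watched video: {video.title}")
                video.delete()

    if __name__ == "__main__":
        prune_watched()  # run nightly via cron or a systemd timer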
It was (well, still is) a terminal-based YouTube client which provided search, locally-managed playlists, a choice of playback mode (video or audio-only), downloads, and a number of other features. YouTube quashed it with API-key limitations such that it is now effectively unusable.
But for a brief moment, YouTube content was accessible from a terminal window or console, without all the godawful dark patterns of the site itself. That was tremendously useful.
These days I'll snag videos by URL via youtube-dl/ytdl, and/or mpv (which invokes those transparently). Not quite as magickal, but still pretty good.
Hrm... Apparently there's a successor project, yewtube:
If you actually subscribe to channels you like when logged in, you'll get much better recommendations. I have over 100. (And no, the recommendations aren't all from my subscribed channels; it's a mix of those which have newer videos and videos similar to my subscriptions.)
That references a G+ post which is of course dead now with that service. Wayback claims to have captured the post but I can't find a version that renders properly (or at all):
(The HN submission referenced a post of my own on G+, which I believe I deleted shortly afterward.)
When I did utilise YT under a YouTube (and for a very brief time, Google+) account, I found the benefits de minimis. And in particular, what I'd most prefer would be to dismiss channels (or entire genres) with extreme prejudice, which simply was not available. And I'm not going to play rat to the Borg's Skinner box to find the reward morsel. I've got methods that work with ample sufficiency for me now.
I absolutely do not agree. I have some kind of legacy "youtube" account, which is not a google account. When I occasionally get switched to my Google account, which does not have my YT history of views/likes/etc, I get tremendously bad recommendations. Just the biggest youtubers with the most click-baity titles, and topics like celebrities, which I actively hate.
Also I find it gives me a ton of recommendations for videos I've already seen.
The best YT experience is to clear cookies and then open a lot of YT links from HN. When human nature drives us to watch a stupid video or two from the YT recommendations, clear cookies again after that session and go back to opening YT links from HN.
Using YT via Incognito or Firefox containers, with frequently-cleared cookies, and using search as a primary discovery method ... works OK... It's not great, but it's not terrible.
The recency bias means that whatever terms you're searching become prevalent for that session. It's not possible to dismiss items or channels (idiotic on YT's part IMO), but you can at least create a short-lived preference instance unpolluted by any previous explorations.
You can delete individual videos from your watch history in the history tab to remove them from recommendation basis. Unlike the “Don’t show me videos like this” button, it works well.
It was never about what you want to see. It’s about what they want you to see.
We will just end up back full circle at TV, with the only thing under our control being the ability to flip between channels.
Seriously, fuck everyone who implements that “Relevant” sorting in searches on every platform, which is usually everything but what you are actually searching for.
I am pretty sure it is working as intended. And the intention is not for you to customize your feed; it is for YouTube to customize it for better retention on average. In the dislike's case, you are marking the video as controversial, which is a good metric for the video, and YouTube pushes it more.
I recently watched a craft video of a guy that was a pretty egocentric douche, at one point he even mentioned how people get offended by him and dislike his videos (not only via the dislike button, but also that) and he mentioned that it doesn't matter. Apparently he was right.
"repulsion" in a hyperbolic space (especially a higher dimension one) has basically no power to guide selection. That's what a "dislike" is, a repulsion in a space where there is a lot more room to slide around in than most folks intuition allows for.
Am I the only one that never thought the like/dislike buttons were tied to recommendations? That it has any effect at all surprises me. They used to be star ratings and were always meant to indicate to the video creator and other users the quality of the video.
I wrote an app called SmartKid that attempts to curate your YouTube recommendations via disliking videos that match a set of keywords (e.g. Minecraft, mrbeast, reaction).
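Not the actual SmartKid source, but the core idea can be sketched against the YouTube Data API v3 (videos.list and videos.rate are real endpoints; the keyword list is illustrative, and `youtube` is assumed to be an OAuth-authorized client rather than a plain API key, since rating requires user authorization):

    # Sketch of keyword-based auto-disliking via the YouTube Data API v3.
    # `youtube` is assumed to be an authorized client, e.g.
    #   youtube = googleapiclient.discovery.build("youtube", "v3", credentials=creds)
    # built with OAuth credentials carrying the youtube.force-ssl scope.

    BLOCKED_KEYWORDS = {"minecraft", "mrbeast", "reaction"}  # illustrative list

    def should_dislike(title: str) -> bool:
        lowered = title.lower()
        return any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

    def dislike_matching(youtube, video_ids):
        # Fetch titles for the given IDs, then rate matching videos as "dislike".
        response = youtube.videos().list(part="snippet", id=",".join(video_ids)).execute()
        for item in response.get("items", []):
            if should_dislike(item["snippet"]["title"]):
                youtube.videos().rate(id=item["id"], rating="dislike").execute()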
Does anyone else just not use youtube anymore? It's not even worth fighting its annoying systems. There might be some content I want to see on there, but it's going to go unseen by me. Sorry video creators. It's probably time for a new site.
Self-reporting is strongly subject to bias. YouTube/Google have long been non-credible in reporting on their own activities. I've had some experience with this myself, as a third-party researcher:
Improved recommendations would be nice, but the ability to exclude specific channels from ever appearing in recommendations or search results would be fantastic. The amount of litter on YT is extraordinary.
When you browse for content in a certain field, it will often refer you to content in the same field, which makes me feel trapped in a small circle and not exposed to other diverse elements.
In practice it doesn't work in the short term, but it should work in the long term, because Google processes so much data, billions of data points, that their algorithms need some time to catch up. Sad but true. The same situation holds for search rankings: they update their ranking index every few months or so according to the billions of data points they gather, with the ranking algorithms having the final word.
Basically every big internet service runs on algorithms. You name it: Ebay, Amazon, Google, Facebook, TikTok, Netflix, Spotify etc.
The problem with Youtube's recommendations is that they are not organic. Producing ad revenue is its main goal. This makes YT fundamentally different than a decade ago, when the algorithm just wanted to show you some genuinely interesting stuff.
Making Youtube better from Google's point of view inevitably makes it worse from a user's point of view. It would take a really bad quarter to tune the algorithm and make it more user centric. And that's not going to happen any time soon.
This concentration of videos around recent topics and YouTube’s circling of that content seems to be one of the reasons so much disinformation and misinformation gets pushed on the platform. It’s not just an inconvenience.
If I watch a video from one or the other side of the political spectrum, or watch something more fringe just because I want to learn about the subject, I get nonstop recommendations for more videos and more videos and more and more and more of the same, nonstop forever. I can’t escape the subject without wiping the account, and while there does seem to be an effect from block channel dislike seems to do little.
I feel like I’m staring into a bathtub full of water in the middle of the ocean. YouTube needs to fix their discovery tools because I think it’s causing someone who goes down a rabbit hole to keep going down that rabbit hole with ease, and that’s not really that cool with radicalizing content.
Just my 2c from recent experience. Love the platform (except for the increase in ads to cable levels recently) otherwise.
If we expand the argument a bit, we might say that we're okay with some bathtubs but not others. If there is a slight chill or the water is ice cold, we feel trapped, but if we like the temperature we're fine not plunging into the ocean.
It's true that we can't avoid having our thoughts affected by our environment, but it feels weird to know that so much of our mental landscape is determined by algorithms or persons that are not connected to the actual material or social fabric of our lives.
Recently I saw a video in my YouTube feed claiming to be a "Live" crypto discussion with 4 people including "Elon Musk". It looked pretty legitimate so I checked it out. It was obviously pre-recorded, re-streamed, and a scam. So I flagged the video and 4-5 hrs later got a notification that it had been removed. The next day the same video appeared. I flagged it again and it was removed 4-5 hrs later (note there were 50k+ people viewing the live stream). The next day it appeared again. I stopped bothering to report it.
Does youtube not have a responsibility to protect viewers from fraud? Surely they can detect the same video being posted again?
I wonder how many people are losing money to these scams.
I wish I could dislike entire genres of things, specifically reaction videos. Laziest "content" ever, and they completely take over the results anytime I search for music.
Not driving my car today has no effect on global warming, according to my house: temperatures today were barely changed from yesterday, according to researchers at Mozilla.