
Personalization makes it incredibly hard to "watch the watchers," because everyone is getting a slightly different view of what Google is doing. I would like to see a program where users submitted data about their recommendations to researchers so that we could uncover Google's opinions. It would have a lot of financial value to YouTubers and would make it harder for Google to abuse their role as censor.
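A minimal sketch of what one crowdsourced report in such a program might look like. All field names and the schema are hypothetical; the only real requirement is that reports from the same user can be correlated without identifying them:

```python
import hashlib
import json
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class RecommendationReport:
    """One user's snapshot of what YouTube recommended to them (hypothetical schema)."""
    user_pseudonym: str               # chosen by the user, never the real account
    watched_video_id: str             # the video the user was on
    recommended_video_ids: List[str]  # sidebar recommendations, in order

    def anonymized(self) -> dict:
        record = asdict(self)
        # One-way hash the pseudonym so researchers can link reports from
        # the same user across time without learning who they are.
        record["user_pseudonym"] = hashlib.sha256(
            self.user_pseudonym.encode("utf-8")
        ).hexdigest()[:16]
        return record

report = RecommendationReport("alice", "abc123", ["v1", "v2", "v3"])
print(json.dumps(report.anonymized(), indent=2))
```

With enough such reports, researchers could compare which channels appear in recommendations across otherwise similar users and spot audiences shrinking.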

I could imagine shadow-banning YouTubers without ever banning them outright, simply by shrinking their recommendation audience.

Further, it would be good for Google. Every little shift in the weather is going to get blamed on them whether they deserve it or not, now that it's common knowledge that they wield this power in more than zero cases. Google is about to discover why judges write opinions: administering justice from secret meetings leads to popular dissent more than it leads to justice.




I clear my YouTube search and watch history about once a week. Partly for privacy, but also because a single binge of, say, metal casting videos does not mean I want them recommended ever again.


One thing this opens you up to is the average recommendation, which is biased towards inflammatory or click-bait content.


I do it every week as well and the recommendations start off being seeded from my subscriptions, not from the trending tab.


They use browser fingerprinting and/or IP addresses as well. Even on a brand new browser session, doing something even slightly related to the previous session brings back its entire recommendation history.
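Fingerprinting of this kind amounts to hashing a bundle of browser attributes that stays stable across sessions, so no cookie is needed to re-identify the machine. A toy illustration (the attribute names are invented, not YouTube's actual signals):

```python
import hashlib

def browser_fingerprint(attributes: dict) -> str:
    # Serialize attributes in a canonical (sorted-key) order so the same
    # browser always produces the same digest, even after clearing cookies.
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

session_a = {"user_agent": "Mozilla/5.0 ...", "screen": "2560x1440",
             "timezone": "UTC+1", "fonts": "Arial,Helvetica"}
session_b = dict(session_a)  # "brand new" browser session, same machine

print(browser_fingerprint(session_a) == browser_fingerprint(session_b))  # True
```

Because the digest is derived entirely from machine characteristics, a "fresh" session on the same device collapses back into the same identity.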


> doing something even slightly related to the previous session brings back its entire recommendation history.

Are you sure that it does, and that it's not just a case of "hey we've never seen this person before, but they watched X, let's immediately start with recommendations Y and Z because other people who watched X were engaged with it?"


I used to think that and gave them the benefit of the doubt, but then I realised that some of the recommended videos had nothing to do with the one I was currently watching, other than the fact that I had watched similar ones previously.


Yeah, the fact that I can clear cookies, open a private browsing window with tracking protection turned on, go to YouTube, and be asked which of my two gmail accounts I want to log in with, is pretty creepy.


Yes, you're right that they use browser fingerprinting, but with the right tool you can easily change your fingerprint, e.g. with Kameleo or Multilogin. https://kameleo.io/


How does that work at, say, the library, or some other public internet kiosk? Or maybe they're just assuming these are edge cases today, amidst the billions of private browser instances on handhelds?


It probably doesn’t work there, since library users would be watching totally different and unrelated videos, whereas in my case they have years of data on very specific viewing habits centred around a few topics.


The problem with this is that it then just suggests whatever hits the front page instead, which is for the most part a load of crap. All I'm doing is swapping recommended videos on, say, metal casting for YouTube's "on-brand" content creators, who pump out generic ten-minute videos every couple of days.



