It's only ever resolved favorably if you have enough clout.
I understand there simply not being enough manpower on Earth to manually review so many cases, but the balance seems so heavily skewed towards automation (which still does a piss-poor job of it) that it feels like there's no human element vetting the final decision at all. That's the approach so many people have gripes about, and rightfully so, until Google strikes that balance.
> I understand there simply not being enough manpower on Earth to manually review so many cases
IMHO it's less about manpower than about how you approach things.
For example, in this case the video was reviewed by people multiple times. They could have just "locked" it in as a good video and put a higher bar on further complaints.
I mean, this video got taken down even though it had been manually reviewed. So not having enough manpower isn't quite the problem.
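As a purely hypothetical sketch of that idea (nothing here reflects how YouTube actually works; the names and threshold are made up for illustration), the "lock" could simply mean that once a human has cleared a video, new complaints are routed back to humans instead of triggering automated takedowns:

    # Hypothetical sketch, not YouTube's actual system.
    from dataclasses import dataclass

    AUTO_TAKEDOWN_THRESHOLD = 0.7  # assumed score needed before automation may act

    @dataclass
    class Video:
        video_id: str
        manually_reviewed: bool = False  # set once a human reviewer clears the video ("locked")

    def handle_complaint(video: Video, abuse_score: float) -> str:
        """Decide what to do with a new complaint against a video."""
        if video.manually_reviewed:
            # Locked videos never get auto-removed again; a human re-checks instead.
            return "escalate_to_human"
        if abuse_score >= AUTO_TAKEDOWN_THRESHOLD:
            return "auto_takedown"
        return "ignore"

The point isn't the exact numbers, just that a prior human decision should raise the cost of overturning it automatically.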
The problem is more along the lines of:
- YouTube doesn't incur much cost for false takedowns (both on copyright and community standards). Even falsely closing whole Google accounts often doesn't cost them much, even though it can cause major damage.
- In most countries there is no good law protecting small content creators against false copyright claims, false "bad content" claims, or discrimination through non-advertisability. Sure, laws exist in some places, but they are mostly ineffective.
> should a publisher be allowed to run a platform that they are unable to curate?
That one is tricky.
My take on it: if they do content-dependent advertising, user-behaviour-based recommendations, and the like, then no, they should not.
If they don't do any of that, I'm not sure.
If they are based on content which isn't widely publicly available but mostly 1-to-1 or 1-to-n (with a smallish n), then I would argue they should _not be allowed to curate content_. I mean, how absurd would it be if Google deleted a file from Drive that isn't publicly visible because it seemed inappropriate (yes, sarcasm). What's next, deleting files from your Android phone or computer when they are deemed inappropriate?!
> should a publisher be allowed to run a platform that they are unable to curate?
They should. If these services didn't exist, people (more precisely, a fraction of them) would be hosting the videos themselves, and we'd have even less curation. It would be perverse to penalize Google for attempting some moderation rather than just working as a hosting service, which is exactly why Section 230 immunizes this kind of imperfect moderation.
With a profit of more than 10 billion per quarter, I find it hard to believe that Google can't hire more people to handle cases like this. It's not a manpower problem.
And 2) is a really interesting question. The most obvious answer is if you can't manage (curate) a platform, you shouldn't run one.
> 2) should a publisher be allowed to run a platform that they are unable to curate?
Absolutely. I’d prefer to see it go back to the old days - where there was no moderation and they had safe harbor.
Of course we shouldn’t permit illegal content but everything else absolutely should be fair game. If you don’t like it then don’t watch it. Someone else might.
Why is there no trust factor established? I mean if you have been posting without any hiccups for many years, you should be protected from trolls like this.
It is common to open accounts and make them appear relatively benign only to pivot to malicious use after some time on the platform. Not to mention the secondary market where people sell aged accounts to bad actors.
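For illustration only, here's a toy version of such a trust factor (entirely made up, not anything YouTube has described; the function, weights, and saturation points are arbitrary). It also shows why the objection above matters: account age alone is easy to buy or game, so recent behaviour has to dominate the score.

    def trust_score(account_age_days: int, total_uploads: int, recent_strikes: int) -> float:
        """Toy heuristic: a long clean history raises trust, recent strikes crush it."""
        age_component = min(account_age_days / 3650, 1.0)   # saturates at ~10 years
        history_component = min(total_uploads / 500, 1.0)   # saturates at 500 uploads
        penalty = 0.5 ** recent_strikes                     # each recent strike halves the score
        return (0.6 * age_component + 0.4 * history_component) * penalty

A channel that has posted cleanly for years would get extra protection against a single anonymous complaint, while a freshly "aged" account that pivots to abuse would lose that protection after its first strikes.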
>I understand there simply not being enough manpower on Earth to manually review so many cases
It's an unpopular opinion, but perhaps there don't need to be so many videos online for public viewing.
I don't think it's much to ask for a real human to review every video before it's made public. The person posting could perhaps pay a small fee to have the video reviewed.
We could have more relaxed rules for videos that are private and shared among a small group of other accounts.
Why do we feel entitled to post any video for public consumption for free?
Actually, now that I think about it, it's kind of surprising that nations don't require a film-and-television-style rating for every video.
I don't really want a video site. What I want is a world where people can be held accountable for the things they post publicly on the internet, whether that's YouTube, Facebook, Twitter or any other medium.
Is that true? I mean, you can also argue it's only ever an issue in the first place if you have enough clout to be targeted. And I haven't seen any proof that people "without clout" aren't able to get it resolved, have you? Even this person isn't exactly a YouTube heavy-hitter, it's a relatively niche channel.
I don't know about YouTube, but I know for a fact that some kinds of bans on Facebook can only be reversed if you have a friend inside Facebook who advocates for you and contacts the relevant team. Source: I have such a friend, and only going through him, not through the official appeal channels, got stuff unbanned (not mine: I was in turn advocating for friends, not that it matters). The official channels are often so slow, ineffective, and inscrutable as to be useless.
Two years ago I had a customer whose professional page we needed to rename from "Person's Name Service" to "Person's Name". That's all; we didn't change the named person or the content of the page, we just removed the service because she did more than that now. I got denied THREE times by bots claiming the change was too severe and changed the "topic" of the page, which would be potentially confusing to users, or worse, akin to changing the page to something followers wouldn't otherwise support ("Local Girl Scout Troupe 4" suddenly becomes "Fans of the KKK").
I had to contact someone I know at FB to get a human to review the change. My friend and the reviewer felt the change was completely legit and allowed it.
What do you do when you don't have someone on the inside?
I think there's an extent where "clout" (depending on how you define that) makes sense.
Let's take Joe Rogan as an example, ignoring for now that he's going to Spotify next year. I'd imagine that what happened to this small YouTuber would be impossible to happen to Joe Rogan. He likely has the cell phone number of several people at YouTube. It could even be the case that his account is flagged in a way that only a high-level staffer can take certain actions. So many people are watching him that it'd be a huge problem for YouTube's audience if something were to happen to his channel in error.
It's good service to the YouTube community to give the most popular folks outsized attention. Not even talking PR here, though I'm sure that's just as powerful a force. Just in terms of "how quickly would the audience be affected." Obviously, there are limits to this and it would seriously hurt YouTube's brand if they were known as being bad for small creators.
I don't think this is true. I expect it gets it right 99.9% of the time, but in this case someone is up against a determined blackmailer who can put a lot of effort into exploiting that 0.1%. The automation will win for a while, but then the attacks will evolve.