Hacker News

Is it even possible to scale a human operation to this level, even with youtube's checkbook?

It's not that AI needs to replace empathy alone; it needs to replace culturally and legally nuanced judgement about copyrighted works, among other things. That's hard to teach quickly to a Mechanical Turk-style workforce, and staffing a team of people competent enough to understand that nuance at youtube's scale has to be prohibitively expensive, if it's even possible.

Humanizing operations buys quality at the expense of putting an upper bound on scale. Few companies are willing to accept that ceiling for fear of losing market share.




No but they should be putting more humans towards their larger creators. Anyone with over 100k subs shouldn’t have to beg on social media just for YouTube to take action. Some creators have direct contacts but even those contacts are limited in what they can do.


Youtube could also offer a fee-based genuine review process for people who care about their channel enough to get their pocketbook out. And to make it even better, if Youtube is as sincere in their claims of "doing their best", make the entire negotiation public, so we can finally see the actual nature of what's going on, rather than relying on unconfirmable promises and pure speculation, as is the current state of affairs.

Do they have to do this? Of course not. Being a private company, they can do whatever they want.

However, by not doing things like this:

- they are constantly losing public support and trust

- they are building a track record of evasive, non-transparent behavior that could be cited in future government legislation

- they run the risk of a cashed-up, honest/transparent competitor suddenly bringing a new platform to market, catching them with their pants down both technically and ~morally (in the high-ground sense of the word). This may sound impossible at YouTube's scale, but consider: what if someone launched a platform that only publishes (and maintains) quality (for some definition of the word) content? Like a YouTube for ~serious content? They could leave the expensive hosting of nonsense to YouTube and cherry-pick the quality content, on which a future self-sustaining business model will likely depend heavily. I think there are a number of people in China with both the money and the understanding of human nature (including deceitful abuse of legislation and the limitations of human intelligence) necessary to make this happen. Whether it's worth the risk I have no idea, but some day user-produced content is going to be a huge cash cow, not to mention the propaganda power inherent in owning the #1 platform.


I agree, but the fee should be a deposit placed by both parties willing to have a review, with the losing side paying. Direct payment for reviews would give YouTube a bad incentive not to improve its AI.

Or straight up go for external arbitrators. But "I'm willing to put money behind my words" seems like an easy signal against abusive mass claims.


I really like that idea. Even if you only make both sides put up $5 each, loser pays, that would add up fast for automated bots making fraudulent claims and hopefully not be too much for people being claimed against.

If a single human reviewer takes 15 minutes on average to process each claim, then that's $20 an hour coming in per reviewer, which could make it viable.

Hopefully that would have a dual effect: making it too expensive for bots to file thousands of fraudulent claims, massively reducing the volume of claims, while humans make fair decisions instead of the AI simply assuming the claimant is correct.
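A quick sanity check of the deposit arithmetic above (the $5 stake, 15-minute review time, and claim counts are the comment's assumptions, not real YouTube figures):

```python
# Back-of-envelope check of the loser-pays deposit model.
# Assumptions from the comments above: each side stakes $5, the winner
# is refunded, and one reviewer handles a claim in 15 minutes.
STAKE = 5                # dollars per side (hypothetical)
REVIEW_MINUTES = 15

claims_per_hour = 60 / REVIEW_MINUTES        # 4 claims per hour
revenue_per_hour = claims_per_hour * STAKE   # only the loser's stake is kept
print(revenue_per_hour)                      # 20.0 dollars/hour per reviewer

# Cost to a bot filing 10,000 fraudulent claims that all lose:
print(10_000 * STAKE)                        # 50000 dollars
```

So the loser-paid stakes alone roughly cover a reviewer's wage, while mass fraudulent claiming stops being free.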


That makes some sense. I'm not sure where I stand on it.

On the one hand, it seems a bit unjust that smaller creators should be snuffed out for things that wash off the back of established players. A two tiered justice system isn't just.

On the other hand, it's a reasonable (and more cost effective) way to do more than nothing.

A bit gross, but it would probably work in YT's favor to suppress cries of injustice early. Would this even be a scandal if it only hit small creators?


> A two tiered justice system isn't just.

Luckily, this isn't the real legal system.

The way I look at it, it's really about the users affected, rather than creators. A creator with 100k fans has more people depending on them than a creator with 100 fans. It makes sense that Youtube would be more careful with the former than the latter, because of the customer impact.


I agree, but fortunately the smaller channels rarely have issues. People go after the larger creators because of the deep pockets and such. Better to start somewhere, at least.


They should be putting humans towards problems that large creators have but not specifically towards the large creators. That way they can help everyone. Those people would naturally be dealing with the large creators more often just because they are heavier users.


That's easy to say and sounds good, but it doesn't solve the scaling problem. Every video creator wants some human judgement as to whether their background audio is fair use before their video is demonetized. If you give that treatment to every video creator, you will soon be employing half the population of the planet as copyright judges.


No, you use the AI to help the human, and the human to spot-check the AI. It's totally scalable.


So 3 years ago, there were 24,000 channels with more than 100k subscribers, and more than 40 channels per day were hitting that mark. Even with over a million subscribers, channels make less than $20,000 a year.

There are simply too many people, and they don't make enough even on the big channels to justify that sort of individual attention. In fact, the only way they can make ANY money is to have so little human attention.

https://socialblade.com/blog/youtube-milestones-four-channel...

https://www.inc.com/minda-zetlin/even-youtube-stars-with-14-...


If they hired just ten people to do 15-minute manual reviews for >100k channels, that's 250 reviews a weekday and more than enough to take care of all the biggest problems affecting them.

It actually gets pathetic when it comes to channels with a million subs. Sure, some of them don't make a ton of money, but there are only a few thousand in total. One small team could do so much. If a channel makes $20k a year, with a significant fraction going to youtube, why can't it get $50 of someone's time?
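A rough capacity check on that ten-person team (my numbers, following the comment's assumptions of 15-minute reviews over a standard workday):

```python
# Capacity of a small review team, under the comment's assumptions.
REVIEWERS = 10
REVIEW_MINUTES = 15
WORK_HOURS = 8                  # per weekday, before breaks

reviews_per_person = WORK_HOURS * (60 // REVIEW_MINUTES)   # 32 per day
max_reviews_per_day = REVIEWERS * reviews_per_person
print(max_reviews_per_day)      # 320 -- so 250/day leaves slack for breaks
```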


>Few companies are willing to accept that ceiling for fear of losing market share.

The question I find interesting is: is this assumption correct? Is service scale the dominant factor in market share, or could you win a significant piece of the pie with a "Smaller YouTube by Humans"?

In theory, a more human touch could attract content creators, who in turn could bring their audience. Whether that would translate to significant market share, though, dunno.


> The question I find interesting is: is this assumption correct? Is service scale the dominant factor in market share, or could you win a significant piece of the pie with a "Smaller YouTube by Humans"?

At least so far, the answer to your second question has been "no," at least if you define "significant" as "that which attracts significant VC investment." (I'm not suggesting that's the best metric, but it's what an awful lot of tech companies use in practice.) This is because the answer to your first question is demonstrably yes: when it comes to services that heavily rely on network effects, service scale is the dominant factor in market share.

It's possible that you could carve out a sustainable niche if you really leaned into the "by humans" part, but it'd still be a niche. I don't think that's necessarily a reason to avoid it.


This already exists. For example, there's stream.cz, where content creators are curated and may get some help with creating content if it's good.

There are other similar online "TVs", like https://www.mall.tv/.


> possible to scale a human operation to this level

Here is one stat I found from 2019:

- 400 hours of video are uploaded to YouTube every minute

So if videos were reviewed at 1x speed, you would need 24,000 humans working 24x7. (Not sure how to scale that to actual jobs, but it seems manageable.)
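The arithmetic behind that figure: 400 hours of video arriving per minute means 400 × 60 review-hours are needed every hour, i.e. that many people watching at 1x around the clock.

```python
# Staffing needed to watch all uploads at 1x speed (2019 stat above).
UPLOAD_HOURS_PER_MINUTE = 400

# Hours of video arriving each hour equals the number of simultaneous
# 1x viewers required to keep up.
viewers_needed = UPLOAD_HOURS_PER_MINUTE * 60
print(viewers_needed)   # 24000
```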

Related: "Facebook had more than 30,000 employees working on safety and security — about half of whom were content moderators." Not a fun job:

https://www.theverge.com/2019/2/25/18229714/cognizant-facebo...


Reviewing videos at 1x speed seems like a very naive approach. Only viewing videos with a detected/reported copyright claim would be more sensible, as that is where nearly all of youtube's issues lie. Even a few hundred people manning disputes from large customers would be hundreds of times better.


I don't think that stat is relevant. This is about reviewing only those videos that face copyright claims.


Ah, that seems trivial in comparison.


For Facebook, they mention a contractor salary of $28800/yr. Multiplying by 30k is just under $1 billion/yr (assuming no overhead).


>Is it even possible to scale a human operation to this level, even with youtube's checkbook?

YT probably doesn't make money. We don't know, because Alphabet doesn't break it out as a separate line item in their earnings reports. Insiders say it roughly "breaks even".


Can you explain exactly the level of scale you're talking about? How many humans do you think would be needed to handle copyright claims?


80 years of video is uploaded to youtube every day, so do the math.


80 years, or 700,000 hours.

At $10 an hour that would be $7m a day, $2.5b a year.

Alphabet's revenue is around $144b a year, so it would cost 2% of revenue to review every single uploaded video. Clearly that's not going to be needed -- you only need to review those which

1) Have a claim by $BIG_CORP

2) Have a counter claim
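Checking those figures (700,000 hours per day at a hypothetical $10/hour wage, against the roughly $144B annual revenue quoted above):

```python
# Worst-case cost of reviewing every upload at 1x, per the figures above.
HOURS_PER_DAY = 700_000          # ~80 years of video uploaded daily
WAGE = 10                        # dollars/hour (assumed)
ALPHABET_REVENUE = 144e9         # dollars/year, as quoted above

daily_cost = HOURS_PER_DAY * WAGE
annual_cost = daily_cost * 365
print(daily_cost)                # 7000000 dollars/day
print(annual_cost)               # 2555000000, i.e. ~$2.56b/year
print(round(annual_cost / ALPHABET_REVENUE * 100, 1))  # ~1.8% of revenue
```

That upper bound is under 2% of revenue, and reviewing only disputed claims would be a tiny fraction of it.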


I don't want to do math with flawed data. Every minute of uploaded content is not going to face copyright strikes, and does not need to be reviewed.


if it works for large scale and profitable webhosting companies, it should work for youtube.

granted, this is not exactly apples/apples, but it's not far off


Until the new EU copyright law with the upload filters, YouTube didn't have to make copyright takedowns as aggressive as it did. It went way above and beyond what the law required. Why did it do that, you ask? Because it was part of whatever deal Google made with studios in order to get access to songs for its failing music services.

One could also argue that if YouTube's takedown filter wasn't as "good" (where good doesn't mean objectively good, but aggressive) as Google made it, then the EU's upload filter wouldn't have passed either, because there would have been no example of anyone "doing it right" (read: taking down anything that smells like a cousin of a copyrighted work, including public-domain works, bird chirps, etc. -- just to be sure).

My point is, YouTube wouldn't have needed as many humans to check whether taken-down content actually needed to be taken down if its algorithms weren't designed to be so aggressive in the first place.

Google dug its own grave here. Now it's stuck between the creators who increasingly see it as a hostile/too risky service, and the people who keep calling for YouTube to censor stuff that "offends them", and who will never ever be satisfied with whatever censorship regime YouTube puts in place, just like the copyright trolls never will be either.



