
I disagree there's "too much to do" when it comes to having humans moderate reported or detected terms-of-service violations.

We have a near-crisis of people having a harder and harder time finding jobs that don't require college degrees. Machines can now do a lot of older jobs just as well as people can, so those jobs are going away. The people are there.

Google is sitting on many billions of dollars it just doesn't want to spend (like $100B or so, I believe?). The money to pay for people is there.

However, Google would not be nearly as profitable if it weren't automating away jobs that, as we've continually seen, automation is pretty bad at doing. This is a business decision, plain and simple. New jobs for humans aren't opening up because tech companies insist on using automation for things automation isn't good at.




At 300 hours/minute they'd need 18,000 pairs of eyes judging videos constantly (not every video needs review the instant it's uploaded, but the average workload is the same). Assuming a normal employee works 50 weeks of 40 hours a year, each employee is only working about 23% of the time. In the end they'd need about 80,000 employees to judge everything that gets uploaded. Wikipedia says Google had 57,100 employees in 2015. With overhead (HR, payroll, managers, manager-managers, devs to write video audit software, etc.), they'd need to triple their workforce solely to keep up with YouTube.
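Here's that arithmetic as a quick Python sketch, using the same assumptions as above (300 hours uploaded per minute, 50 weeks of 40 hours per employee-year):

    # Back-of-envelope staffing estimate for reviewing all uploads in real time.
    UPLOAD_HOURS_PER_MINUTE = 300
    HOURS_PER_YEAR = 365 * 24               # 8,760 wall-clock hours in a year
    EMPLOYEE_HOURS_PER_YEAR = 50 * 40       # 2,000 working hours per employee

    # 300 hours of video arrive per minute, i.e. 18,000 hours per hour,
    # so 18,000 seats must be filled around the clock.
    concurrent_reviewers = UPLOAD_HOURS_PER_MINUTE * 60

    # Each employee covers 2,000 / 8,760 ~ 23% of the year.
    utilization = EMPLOYEE_HOURS_PER_YEAR / HOURS_PER_YEAR

    total_reviewers = concurrent_reviewers / utilization
    print(concurrent_reviewers, round(utilization, 2), round(total_reviewers))
    # -> 18000 0.23 78840 (about 80,000)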


As both I and the parent suggested, you'd only need humans reviewing content that is reported or automatically flagged, plus a second tier of people handling appeals.


Right, but if, as is happening in this case, the owner of the video intentionally does not appeal... then what?


Money in the bank is not the same as being cash-flow positive. If YouTube had to spend money in the bank to verify videos, such that the whole enterprise was net cash-flow negative, eventually the bank account would be depleted. Why the hell would anyone run such a business?

If human curation is to work, it has to work while still generating profits; otherwise you're running a charity on its way to bankruptcy, not a business.

As for your other claims, they're based on assumptions. The feasibility depends on how often things are flagged. At 300 hours per minute, if every minute of every video had to be checked manually, you'd need 18,000 people continuously watching video 24 hours a day, or 54,000 if you consider an 8-hour workday. But handling a video is unlikely to take only as long as watching it: there are gaps in between, as well as "stalls" when you actually come across something questionable. Conservatively, say it takes 5 minutes of reviewer time per 1 minute of video. That raises it to something like 250,000 people required for 100% curation.

But not every video is flagged. Even if you only have to sample 1% of those videos, you're talking 2,500 full-time employees. That's going to cost about $500M a year with pay and benefits.
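For what it's worth, here's the same math in Python; the ~$200k fully-loaded cost per head is an assumption implied by the $500M figure above, not a published number:

    # Extending the estimate: 5 reviewer-minutes per video-minute,
    # 8-hour shifts, and reviewing only a 1% sample of uploads.
    UPLOAD_MINUTES_PER_MINUTE = 300 * 60    # 18,000 video-minutes per minute
    REVIEW_OVERHEAD = 5                     # minutes of review per video-minute
    SHIFTS_PER_DAY = 3                      # 24 hours / 8-hour workday

    full_curation = UPLOAD_MINUTES_PER_MINUTE * REVIEW_OVERHEAD * SHIFTS_PER_DAY
    print(full_curation)                    # 270,000 (rounded to ~250,000 above)

    sampled = full_curation * 0.01          # review a 1% sample
    cost = sampled * 200_000                # assumed fully-loaded cost per head
    print(round(sampled), f"${cost / 1e6:.0f}M/yr")
    # -> 2700 $540M/yr, in line with the ~$500M figure above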

And even then, you will have false positives, lots of them, because reviewers make subjective judgments about things like hate speech, racism, and pornography. I've seen political channels I follow, like The Young Turks or Secular Talk, get flagged and punished by human curators who considered some of their videos to cross the line.

So if you really, really want to be sure, you can't depend on a single reviewer. Flagged videos need to be peer reviewed by a panel of people, and those people usually have to know the CONTEXT of the video they're reviewing, because satire can go undetected: say, a video designed to expose extremism by parodying it can, without context, be falsely seen as extremist itself.

This is nowhere near as simple as you flippantly make it out to be. It would be extremely expensive and still end up with many false positives, and even one or two high-profile VIP false positives get you the same negative PR.

I personally think it's better to just tag videos and let people filter them on their own than to ban them. Let porn be hosted on YouTube, but tag it and make the tag filtered by default, with a safe-search setting that lets you remove the filter.

In other words, it should work the same way porn works on Google.com. I mean, if you let your kids surf YouTube because Google bans porn there, but don't ban them from using regular Google search, do you really think they can't find porn?
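To make the tag-and-filter idea concrete, here's a hypothetical Python sketch; the tag names and function are invented for illustration, not anything YouTube actually exposes:

    # Hypothetical tag-based filtering instead of outright bans: sensitive
    # tags are hidden by default unless the viewer turns the safe filter off.
    RESTRICTED_TAGS = {"porn", "graphic-violence"}  # invented example tags

    def visible(videos, safe_mode=True):
        """Return the (title, tags) pairs allowed under the current filter."""
        if not safe_mode:
            return list(videos)             # opted out: show everything
        return [v for v in videos if not (v[1] & RESTRICTED_TAGS)]

    catalog = [("cat compilation", {"animals"}), ("nsfw clip", {"porn"})]
    print([t for t, _ in visible(catalog)])                   # ['cat compilation']
    print([t for t, _ in visible(catalog, safe_mode=False)])  # both titles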


As I said, I definitely don't see humans manually reviewing all uploads, just flagged content. I don't have the statistics, but I'm guessing it's less than 1%. (Assuming you don't count the deluge of DMCA requests.)

Your $500M figure suggests paying people a lot more than I would to watch YouTube videos. :P

It is not simple, I agree, but it is also not unreasonable: Google makes a significant amount of money operating these platforms, and it's reasonable to expect them to build sound moderation practices into that formula.

I agree with you, however, that an ideal case would simply be to not moderate content on YouTube beyond a SafeSearch filter type behavior. Although that would invite a whole additional set of complex problems.

Right now, one of the big ways YouTube moderates content is to demonetize certain videos if they contain objectionable content their advertising customers do not want to be associated with. I believe Google currently takes a loss on the content that falls into this category.

But if porn was on YouTube, there would be a LOT of porn on YouTube. A truly unholy amount of porn. If there can be porn on something, there is a lot of it. So presumably this means Google would need to start selling adult content ads as well.


The only public statistics available say that 90 million unique people have flagged videos, and 1/3 of those have flagged multiple videos. It's hard to generalize from there because you don't know how many of those flags land on distinct videos, but in the worst case, 60 million unique flags would be about 0.5-1% of all YouTube videos.
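Taking those numbers at face value, the implied catalog size falls out directly; these are the thread's assumptions, not published YouTube figures:

    # 60M unique flags said to be 0.5-1% of all videos implies the
    # total catalog size; purely back-computed from the comment above.
    unique_flags = 60_000_000
    for share in (0.005, 0.01):             # 0.5% and 1%
        print(f"{share:.1%} -> {unique_flags / share / 1e9:.0f}B videos implied")
    # 0.5% -> 12B videos implied
    # 1.0% -> 6B videos implied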


> Google is sitting on many billions of dollars it just doesn't want to spend (like $100B or so, I believe?). The money to pay for people is there.

So now you are saying that the only companies that should host video are those that can afford to lose billions by manually checking every upload?

Would this rule have applied to YouTube before it became part of Google and had a few billion spare in the bank?

OTOH, the regulations around video hosting probably already make it almost impossible for a serious competitor to emerge.



