Hacker News

As if that's even remotely feasible.



Of course it is. It will take a fair bit of money and manpower, but it's not infeasible by any stretch of the imagination.


Feasible = economically feasible. According to some YouTube stats, on average 10 hours of video are uploaded every minute. That means you'd have people screening about 2 man-years of video every day, which translates, if you take into account three shifts of workers, to about 2,000 FTEs.
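A quick back-of-the-envelope check of those numbers (the 10-hours-per-minute figure is the one quoted above; the 8-hour shift length is my assumption):

```python
# Back-of-the-envelope check of the screening workload.
UPLOAD_HOURS_PER_MINUTE = 10      # figure quoted in the comment (Nov 2006)
MINUTES_PER_DAY = 24 * 60

hours_uploaded_per_day = UPLOAD_HOURS_PER_MINUTE * MINUTES_PER_DAY  # 14,400 h

# Expressed as continuous person-years of viewing arriving per day:
person_years_per_day = hours_uploaded_per_day / (24 * 365)

# Staffing: one screener watches one stream for an 8-hour shift (assumed),
# with three shifts covering the clock.
SHIFT_HOURS = 8
screeners_needed = hours_uploaded_per_day / SHIFT_HOURS

print(person_years_per_day)  # roughly 1.6, i.e. "about 2 man-years"
print(screeners_needed)      # 1,800, i.e. "about 2,000 FTEs"
```

That's before any double-screening or redundancy, so the 2,000-FTE estimate is if anything on the low side.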

If one of those screeners happens to be slacking on the job when that privacy-invading video comes by, you are still open to a lawsuit, and now you have a much bigger problem liability-wise, because by screening you took responsibility for the content.

That 10 hours every minute is a November 2006 figure; I have no idea what the current amount is.


First of all, videos originating in Italy would be a substantially smaller subset of the global figures, so I don't believe the task is as incomprehensibly vast as you may think.

Second, I think the legal complaint here is that there is no attempt at screening the content until it has already been published.

I obviously don't agree with it, but that's what's being contested by the plaintiffs. And I think just showing you have safeguards in place would demonstrate proper intent and meet a standard of reasonable prudence.

But back to the technical issue of screening.

The solution is easy. Drop the realtime requirement. Say:

"Hey, sorry, Italians, we have to screen everything. We've got 100 people working on it around the clock, and your video will get uploaded in two weeks' time. We'll email you when it's ready."

It'll make YouTube less attractive, sure, but every other video publisher (Vimeo, Justin.tv, et al.) would have to do the same thing. So all in all, the courts will be happy, the Italian users will have a diminished service, and the world will spin on.


Dropping the real-time requirement doesn't help. If you don't have enough resources to screen a day's worth of incoming videos in one day, your queue just gets larger and larger -- eventually large enough that an uploaded video will take a few years (or more) to be posted.
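The backlog argument can be sketched with a toy calculation (both rates below are hypothetical, chosen only to show the effect of arrivals outpacing capacity):

```python
# Toy backlog model: when videos arrive faster than they can be screened,
# the queue grows without bound, and so does the wait for new uploads.
ARRIVALS_PER_DAY = 14400   # hours of video uploaded per day (hypothetical)
CAPACITY_PER_DAY = 12000   # hours of video the staff can screen per day (hypothetical)

backlog = 0
for day in range(365):
    backlog += ARRIVALS_PER_DAY - CAPACITY_PER_DAY  # net growth each day

# After one year, a new upload waits behind this many days of screening work:
delay_days = backlog / CAPACITY_PER_DAY
print(backlog, delay_days)
```

With even a modest shortfall in capacity, the wait for a newly uploaded video grows linearly and never stabilizes; only matching capacity to the arrival rate fixes it.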


From some other pages about this case (italics mine):

"Prosecutors say they are aware Google cannot screen all videos, but maintain the company didn't have enough automatic filters in place as well as warnings to users on privacy and copyright laws. They also say Google didn't have enough workers assigned to its Italian service in order to react quickly to videos flagged as inappropriate by viewers."

So they already know screening isn't feasible; the rest is really only a matter of degree, and that should carry jail terms?


I'm not defending the prosecution's legal position, only contesting your claim that it would be infeasible to screen the content.


You're entitled to your opinion that it's feasible.

Just as I'm entitled to mine, which seems to be shared by Google, and by the prosecution in this case.

Of course that is not a guarantee that I'm right.

But let's just say that for 5 years or so I operated an office with people screening live video, and because of that I'm all too aware of how fallible it is. You'd have to screen everything twice, have an open channel between your screeners and the uploaders, and you'd need positive identification of everybody in every video uploaded.

And our screening parameters were a lot less strict than what Google/YouTube would have to operate under in order to ensure that something like this could never happen again.

For one, any video that has a person in it could be construed as a breach of privacy, so now you have to figure out who that person is, maybe ask their permission, and so on.

The burden of proof that a video is OK to upload should lie with the uploader; only they have the ability to make that call, and everybody else is missing just too much context.

And why would a child with Down's syndrome have different privacy rights from anybody else?

So a ruling favourable to the plaintiffs would quickly be seized on by follow-ons from other people who felt that their privacy was somehow violated, possibly in different media (text? photographs?), and so on.

If there ever was a slippery-slope example, then this would be it, and I think it really ought to stop right where it is at this point.


Probably the easiest thing to do would be to block access to Google video services in Italy.



