Deciding whether nudity (or anything, really) is "artistic" is hard, and probably doesn't scale. There may not be a good solution for this problem in the general case.



Maybe this isn't a problem we should be trying to solve with computers right now, then. We've managed to survive as a species, and as a Western society, for years by making these kinds of determinations using people; maybe we should continue to do that.


Part of the failure here was in the human organization attempting to determine "artistic merit". The definition of art has been debated for centuries. Contractors aren't always going to make the right call, even if they do have a better hit rate than a computer.

I don't know why everyone seems to think I'm only talking about technical solutions. Human solutions have to scale, too.


The problem then is that the only practical way to implement that is crowdsourced voting/flagging, and enough people out there don't get some kinds of art that they would flag it that way. I think most controversial modern art would not survive on a platform like YouTube where everyday people make a determination on its palatability.


That's just plainly false. It's entirely possible to implement it using employees that are human reviewers.

If that's not profitable then they should rework their business model until it is profitable.


It is entirely possible that the numbers don't work. From a business point of view, if the only downside is that some art videos get incorrectly categorized and then correctly categorized after a manual review, then reworking their business model doesn't make much sense.

This is bad PR for sure, but I don't think the system is failing that badly. The rate at which video is uploaded is truly staggering, and that's assuming that the successes of the automation deter many bad actors.


You are wrong. You do not understand how much media is being uploaded to YouTube. All of Mechanical Turk working together simultaneously could not filter all the stuff being uploaded. 300 hours of video are uploaded to YouTube every minute. [0]

[0] http://lmgtfy.com/?q=how+much+content+is+uploaded+to+youtube...
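
To make the scale concrete, here's a quick back-of-the-envelope calculation (assuming the 300 hours/minute figure and, generously, one reviewer-hour per hour of video):

    # Rough math on what full human review of all uploads would take.
    upload_rate = 300                 # hours of video uploaded per minute [0]
    minutes_per_day = 60 * 24

    hours_per_day = upload_rate * minutes_per_day    # 432,000 hours of new video per day
    shift_length = 8                  # hours in a reviewer's workday
    reviewers_needed = hours_per_day / shift_length  # 54,000 full-time reviewers, every day

    print(hours_per_day, reviewers_needed)

And that's before double-checking borderline calls or covering weekends.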


Or maybe people should grow up and realize that the human body isn't something that should be shocking. Especially since nobody seems to complain about bloodshed and decapitations in videos.


Google Maps crowdsources business and location data. I wonder if YouTube couldn't do the same, rewarding enough effort with YouTube Red subscriptions (like Maps does).

Then they could have more granular controls in clients, like saying "Block videos where at least half of the crowd say there is sexual content".
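
A rough sketch of that client-side check in Python (the flag data and the one-half cutoff are made up for illustration):

    # Hypothetical crowd flags: each viewer tags a video with zero or more categories.
    flags = [
        {"sexual content"},
        {"sexual content", "language"},
        set(),                        # this viewer flagged nothing
        {"violence"},
    ]

    def crowd_fraction(flags, category):
        """Fraction of voters who tagged the video with `category`."""
        votes = sum(1 for f in flags if category in f)
        return votes / len(flags) if flags else 0.0

    # "Block videos where at least half of the crowd say there is sexual content"
    if crowd_fraction(flags, "sexual content") >= 0.5:
        print("hidden by this client's filter")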


I'm starting to like this idea better. It's at least an approach that isn't yet a proven failure. You can probably get some mileage out of weighting reliable users more heavily than randos, being smart about what thresholds constitute sufficient evidence, etc. Existing systems are definitely good at that sort of thing.
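
Something like this, maybe (a sketch only; the reliability scores and the threshold are invented):

    # Weight each flag by the flagger's track record instead of counting heads.
    # reliability ~ fraction of a user's past flags that reviewers upheld.
    flags = [
        ("sexual content", 0.95),   # long-time user with a good track record
        ("sexual content", 0.10),   # fresh throwaway account
        (None, 0.80),               # reliable user who saw nothing objectionable
    ]

    def weighted_score(flags, category):
        weight_for = sum(w for cat, w in flags if cat == category)
        total = sum(w for _, w in flags)
        return weight_for / total if total else 0.0

    EVIDENCE_THRESHOLD = 0.5        # a judgment call to tune, not a magic constant
    if weighted_score(flags, "sexual content") >= EVIDENCE_THRESHOLD:
        print("enough weighted evidence to apply the category")

The nice property is that a pile of throwaway accounts contributes almost nothing next to a few users with real track records.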


I was thinking the users could set their own thresholds, just like they do when searching with date and resolution ranges.
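
Continuing the sketch upthread, letting each user pick their own cutoffs could be as simple as (names and numbers invented):

    # Each user sets personal cutoffs, the way search filters expose date ranges.
    my_thresholds = {"sexual content": 0.3, "violence": 0.8}

    def visible(video_scores, thresholds):
        """Show a video only if no crowd score crosses this user's cutoff."""
        return all(video_scores.get(cat, 0.0) < cutoff
                   for cat, cutoff in thresholds.items())

    print(visible({"sexual content": 0.5}, my_thresholds))  # False: 0.5 >= 0.3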


The problem you might have is that most people who search for terms that would bring up similar results will have a vested interest in sexual content and will produce false flags.


Not to mention when you can crowdsource banning content, competitors will intentionally flag content that competes.


I didn't say banning content. I suggested categorization. Like the categories in the detailed MPAA breakdowns ("violence", "sexual content", "language", etc.).

Maps asks more opinionated questions as well. YouTube's equivalents could include "Good for kids", "Funny", "Good on a small screen", and things like that.


What do you do when people flag random things they don't like as "sexual content", solely so that they would be less visible?


Same thing you do if someone puts the wrong business hours in the Google Maps database.


> There may not be a good solution for this problem in the general case.

Surely the good solution is: decide whether you want to be 'family-friendly' at the expense of suppressing valid but possibly offensive art, or 'expression-friendly' at the expense of losing easily offended parents, and then do that. No technical solutions to identifying nudity involved. (I'd say YouTube's decision is "don't offend anybody", which is, of course, not really a decision.)


Neither of those are really good options, despite your attempt to spin it. Do you think a parent is merely being "easily offended" if they don't want their kids watching "Fifty Shades of Grey" or the equivalent on YouTube? I agree that the other extreme isn't much good either. That's why it's hard.

Note that no one said anything about a technical solution. Google's human review process is what failed here, and that's what I'm saying is likely to continue.


If a parent doesn't want their child watching certain types of content, the parent needs to monitor the child and not rely on YouTube filters.

If you do not approve of Fifty Shades, it is up to you to police that. Another parent might be offended by religious videos. Another by non-religious ones. Some may not allow Trump in the home while others might burn any Hillary comeback book.

There isn't a technical solution to parenting.


> If a parent doesn't want their child watching certain types of content the parent needs to monitor the child

Why is this so hard for people to grasp? It's not the job of other people to raise your kid(s) for you.


The problem is the connected and omnipresent nature of the web means there's a constant, loud sea of voices and content you have to war against, one that's really hard to control or stop. In the old days, you worried about dirty magazines under the bed, but the net has an infinite supply of the dirtiest content that's more or less trivial to find.

The culture before often affected your kids a lot, but in the 70s or 80s it was local and controllable. Now it's distributed and universal, while parenting is more or less the same. The old rules and chestnuts no longer apply.



Do you really wanna say the internet is part of your village in the context of raising a child?


No, I was just giving a possible reason as to why people might act like it's the role of others to raise their children. I neither support it nor wish it extended to the Internet.

I'm not actually sure why you'd think I was supporting it; I'm merely showing why it may be that way.

In my experience, it became fairly popular to say/believe such things in the early 70s, after the hippies took their clothes off and moved into the woods for a short while.


Unless there's some kind of technical tool, that amounts to standing over the kid's shoulder any time they're touching a computer and stopping them from tapping the wrong buttons. Besides being difficult, this sort of thing causes its own problems from a parenting standpoint. So yeah, still not a good solution.



