I did not intend to provoke painful emotion.

>It's neither a rifle nor a turtle.

I disagree. Would you also say the pistol is no longer a pistol?

That doesn't follow, because a pistol painted so that it classifies as a lunch box is still a pistol.

The "Turtle" is a plastic replica of a turtle not a real turtle. It's the treachery of images idea - "Ceci n'est pas une pipe."

Humans see its form and recognize the plastic replica as a representation of a turtle because we prioritize its shape over its texture. That seems more correct to us, but I'm not sure it really is more correct in some objective way. In this case I suppose you could say it is, because a turtle is what we mean for it to represent, but the test seems rigged in favor of human visual classification.

I think an interesting question is what adversarial attacks exist on human vision that may not affect a machine (certain optical illusions?). If we're also vulnerable to this kind of manipulation, then it may not be something unique to computer vision; we may just be picking test cases that we're better at. Then it's just a matter of tradeoffs and deciding when we want human-style classification.
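
(For anyone curious how the machine-side attacks are usually built, here's a minimal sketch of the standard fast gradient sign method in PyTorch. The model choice, epsilon, and missing input normalization are illustrative assumptions; the physical turtle demo used a more involved optimization that keeps the texture adversarial across viewpoints and lighting, but the underlying idea is the same.)

    import torch
    import torch.nn.functional as F
    import torchvision.models as models

    # Illustrative pretrained classifier; any differentiable image model works.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

    def fgsm_attack(image, true_label, epsilon=0.03):
        # Fast Gradient Sign Method: nudge every pixel a small, bounded amount
        # in the direction that most increases the loss on the true label.
        image = image.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(image), true_label)
        loss.backward()
        # The per-pixel change is at most epsilon, so it is nearly invisible
        # to a human, yet it is often enough to flip the predicted class.
        adversarial = image + epsilon * image.grad.sign()
        return adversarial.clamp(0, 1).detach()

    # Usage sketch: image is a (1, 3, H, W) tensor with values in [0, 1],
    # true_label is a shape-(1,) tensor holding the correct class index.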


It's a plastic replica of a turtle with an artificial rifle texture.

The human's error is in missing the texture. The computer's error is worse: it misses the turtle and thinks the texture is an actual rifle.


I agree, but it's an unfair test - it was designed to confuse the computer and not the human.

For a counterexample, imagine that you make a toaster that looks exactly like a pistol, but it actually just toasts bread.

A human would think it's a pistol when looking at it (so would the machine in this case). There may be adversarial examples where the human classification is worse than the machine's if you specifically try to make examples that are bad for the human visual system.
