
Of course, if you put your average city dweller on your island, he will probably die of thirst before the tigers get to him. But take an (unarmed) Navy SEAL as your human on the island, and I'm pretty sure that in a couple of months he will be the Tiger King.

And Hawking would just ask his assistant to put the cat into the box. You are artificially depriving him of his actual resources to make a weak point.




Navy SEALs are not superhuman. A single adult tiger would slaughter just about any unarmed human with near certainty, even a SEAL. Even armed with any weapon other than a firearm, the chances of besting a tiger and coming out without being maimed or mortally wounded are close to zero.


Why so afraid of the 400 pound killing machines?


What percentage of people are Navy SEALs?

If I were a tiger, I'd probably think this island sounded like an excellent place for a fun adventure holiday with my tiger friends, and I'd be right to do so. Most likely outcome: good food, some exercise, and we can take some monkey heads home for our tiger cubs to play with.


Fun, until the Navy SEAL appears. It takes only one.

It's the same thing with AI: it takes only one, no matter how many dumber ones we entertained ourselves with before.


What's up with your Navy SEAL analogy in these comments? Do you know any SEALs? Have you ever seen a tiger up close, say 20 feet or so?

One adult tiger will kill an unarmed SEAL (and any other human being) in single combat. It would barely exert itself. It would make more sense if, when you said "it only takes one", you were referring to how many tigers could kill five or six SEALs who don't have guns. Fuck it, give your SEAL a fully automatic weapon - the odds are still not in his favor against a single tiger. Large felines have been known to kill or grievously injure humans after taking five high caliber rounds.

This is exactly idlewords' point in his essay. Your argument about a SEAL landing on an island of tigers and somehow flipping a weird "Planet of the SEALs" scenario on them is exactly what many (most?) AI alarmists do. These ridiculous debates get in the way of good-faith discussion about the real dangers of AI technology, which is more about rapid automation with less human skin in the game than it is about the subjugation of the human species.

This sort of unrealistic scenario is really fun to talk about, but that's all it is - fun. It's not really productive, and it conveniently appeals to our sense of ego and religious anxiety. Better to do real work and talk about the problems AI will cause in the future.



