
>> The generally accepted definition seems to be "having better than human performance"

I don't think there's a generally accepted definition and I don't agree that performance on its own is a good measure. Humans are certainly not as good at mechanical tasks as machines are, duh. But how can you call "superhuman" something that doesn't even know what it's doing, even as it's doing it faster and more accurately than us?

Take arithmetic again. We know that cats can't do arithmetic, because they don't understand numbers, so it's safe to say humans have super-feline arithmetic ability. But then, how is a pocket calculator super-human, if it doesn't know what numbers are for, any more than a cat does? There's something missing from the definition, and therefore from the measurement, of the task.

I don't claim to have this missing something, mind you.

>> Why would an intelligence built by humans not be able to be superhuman?

Ah. Apologies, I got carried away a bit there. I meant to say that I doubt we can create superhuman intelligence using machine learning specifically. My thinking goes like this: we train machine learning algorithms using examples; to train an algorithm to exhibit superhuman intelligence we'd need examples of superhuman intelligence; we can't produce such examples because our intelligence is merely human; therefore we can't train a superhuman intelligence.
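
To make the point concrete, here's a toy sketch (my own illustration, with made-up data and a plain least-squares fit, not anyone's actual system): in the supervised setting, the only training signal the learner gets is the set of labels we supply, so the best outcome training can aim at is reproducing that labelling. The labels here stand in for the human-produced examples in the argument above.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))                  # inputs
    human_labels = X @ np.array([1.0, -2.0, 0.5])  # stand-in for human-provided answers

    # Ordinary least squares: pick weights that best reproduce the given labels.
    w, *_ = np.linalg.lstsq(X, human_labels, rcond=None)

    predictions = X @ w
    print(np.allclose(predictions, human_labels))  # the target it converges to *is* the labels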

I also doubt that we can create a superhuman intelligence in any other way, at least intentionally, or that we would be able to recognise one if we created it by chance, but I'm not prepared to argue this. Again, sorry about that.



