
I love how people assume that because we are getting very good at efficiently encoding human intelligence, we must be very close to creating superintelligence, and that progress on creating superintelligence will somehow track the rate of progress on the simpler problem of encoding existing intelligence.



If we could create human-level intelligence in a computer, it would already be a superintelligence. No human on Earth is capable of reading and remembering an Internet-scale corpus of data, or of doing math at GHz speeds, etc.


When it comes to speed, the comparison I like to use is that transistors are faster than synapses by roughly the same ratio that a marathon runner is faster than continental drift.
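That comparison can be sanity-checked with back-of-envelope arithmetic. All the figures below are my own illustrative assumptions (not from the comment): ~1 GHz transistor switching, ~100 Hz neuron firing, a 2.5-hour marathon, and ~2.5 cm/year of continental drift.

```python
import math

# Back-of-envelope check of the transistor/synapse vs. runner/drift comparison.
# Every number here is an assumed order-of-magnitude figure, not a measurement.

SECONDS_PER_YEAR = 365 * 24 * 3600  # ~3.15e7 s

transistor_hz = 1e9   # assume ~1 GHz switching; real transistors switch faster
synapse_hz = 100      # assume ~100 Hz typical neuron firing rate

runner_speed = 42195 / (2.5 * 3600)     # 42.195 km marathon in 2.5 h -> ~4.7 m/s
drift_speed = 0.025 / SECONDS_PER_YEAR  # ~2.5 cm/year -> ~8e-10 m/s

hardware_ratio = transistor_hz / synapse_hz  # ~1e7
motion_ratio = runner_speed / drift_speed    # ~6e9

print(f"transistor/synapse: ~10^{math.log10(hardware_ratio):.0f}")
print(f"runner/drift:       ~10^{math.log10(motion_ratio):.0f}")
```

Under these assumptions the two ratios come out within a few orders of magnitude of each other (~10^7 vs. ~10^10), so the rhetorical point holds: both gaps are astronomically large.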


If we can match our existing intelligence (even if only along a jagged frontier of capabilities), our own progress toward superintelligence won't matter much, because we won't be the ones making it.



