
Well, humans aren't machines. Why would a definition of AGI need to apply to humans? On the other hand, as we gain the ability to edit our DNA, I think recursive self-improvement over generations is on the table for our species.

I guess it would be useful to have a definition of weak AGI, but after reading Bostrom's Superintelligence, I struggle to imagine an AGI that doesn't lead to a singularity or intelligence explosion. That scenario seems like wishful thinking.



