I was alluding to a popular notion of AI: that it isn't AI if it doesn't feel human to us. I agree there is valuable knowledge to be gained from brain simulation and DNA computing.

AI press coverage has a tendency to "humanize" all AI research. (So your neural network is on par with a rat brain? Can we add such chips to our brain and create androids?)

I understand why this happens -- to make the work more accessible, to get more PR, and to attract research grants -- but I do think it is silly. It makes people ask "When will we have AI?" when AI is already here; they just don't accept it, because it lacks human emotions. By taking the analogy too far, some make human-like intelligence a prerequisite for AI, when true AI could just as well be alien. Some AI researchers have no qualms about playing along with this notion.
