
The scenario you described is just suicide. An AI that is acting on behalf of a controller and has no ability to make autonomous decisions is just a tool. To me that's no different conceptually than a race destroying itself with nuclear weapons, but simply replacing nukes with some sort of drone or automated weapon. It wouldn't be AGI.



I agree; I'm not saying it would be AGI, just that it would make AI a solution to the Fermi Paradox.


Absolutely wrong. Heroin is just a tool to get high, so obviously a drug addict can choose whether to use it, right?

The problem is that technology like AI provides short-term economic advantages to those who use it, regardless of the long-term consequences. It is a prisoner's dilemma, however: each individual adopter gains an edge, but if everyone uses it, everyone ends up worse off.

It is through a series of these prisoner's-dilemma-type situations that technology advances. Yes, individuals can choose whether to use it, but in practice we do use it, because each step provides an improvement for the individual, even as society becomes worse off. Thus, as a society, we cannot choose not to use it.

The problem is that individual choice for individual gain does not always add up to an emergent societal choice for societal longevity.
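The payoff structure this argument assumes can be sketched as a classic prisoner's dilemma. The numbers below are made up purely for illustration; "adopt" stands for using the technology, "abstain" for not using it:

```python
# Toy payoff matrix for the adoption dilemma (hypothetical numbers).
# payoff[(my_choice, their_choice)] = my payoff.
payoff = {
    ("adopt",   "adopt"):   1,  # everyone uses it: society worse off
    ("adopt",   "abstain"): 5,  # I gain an edge over non-users
    ("abstain", "adopt"):   0,  # I fall behind
    ("abstain", "abstain"): 3,  # nobody uses it: collectively better
}

def best_response(their_choice):
    # Pick whichever of my choices maximizes my own payoff.
    return max(["adopt", "abstain"],
               key=lambda mine: payoff[(mine, their_choice)])

# Adopting dominates: it is my best move no matter what the other does...
assert best_response("adopt") == "adopt"
assert best_response("abstain") == "adopt"
# ...yet mutual adoption (1 each) is worse than mutual abstention (3 each).
assert payoff[("adopt", "adopt")] < payoff[("abstain", "abstain")]
```

Because adoption is the dominant strategy for every individual, the equilibrium is universal adoption, even though both parties would prefer the mutual-abstention outcome — which is exactly the "individual gain, societal loss" dynamic described above.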




