>Since then the DoD has started to take an interest in AGI, and in fact today, during my one weekend a month drill at the Pentagon, I had a great long conversation with Maj. General Seng [1] who is heading up efforts around implementing ISR systems with more autonomous capabilities and exploring how an AGI would be utilized in defense.
Isn't that basically the most obviously bad idea ever, so terribly stupid that there have been several movies chronicling how massively bad an idea it actually is to build "AGI" for military goals?
This idea is cliché precisely because it's the most obvious course of action for the military. Don't forget that it's DoD who sponsored the previous AI summer.
Besides, if they don't do it, the enemy surely will!
Well no, mostly we don't, because the brains that happen to be the weapons still need a world to live in after they fight. The difference with artificial brains is, hey, why would they give a crap about what comes after the killing?
However, we mostly only arm brains that have an IQ of <120 and some human values baked in (e.g. empathy toward other humans and living beings, a family to protect, or dependence on other social structures that demand pro-social attitudes).