Hacker News

>Since then the DoD has started to take an interest in AGI, and in fact today, during my one weekend a month drill at the Pentagon, I had a great long conversation with Maj. General Seng [1] who is heading up efforts around implementing ISR systems with more autonomous capabilities and exploring how an AGI would be utilized in defense.

Isn't that basically the most obviously bad idea ever, so terribly stupid that there have been several movies chronicling how massively bad an idea it actually is to build "AGI" for military goals?




This idea is cliché precisely because it's the most obvious course of action for the military. Don't forget that it was the DoD that sponsored the previous AI summer.

Besides, if they don't do it, the enemy surely will!


Because arms races and cold wars with we-all-die-grade weaponry are an obviously excellent idea!

/s


If you still have a military (or the need for one) after you build an AGI, you built it horribly, horribly wrong.


I think you have been watching too many movies.


Movies sometimes get this thing right though. Don't forget that Dr. Strangelove pretty much turned out to be a documentary!


So your answer is, "Sure, let's weaponize brains, nothing can possibly go wrong"?


We already weaponize brains.


Well no, mostly we don't, because the brains that happen to be the weapons still need a world to live in after they fight. The difference is that with artificial brains, hey, why would they give a crap what comes after the killing?


However, we mostly only arm brains that have an IQ of <120 and some human values baked in (e.g. empathy to other humans and living beings; a family to protect, or dependence on other social structures that demand pro-social attitudes).



