Let's say we create an AI that can think for itself.
There's a fear, I think, lurking in people's subconscious: what if the AIs, on their own initiative, decide that humans are wasteful, inefficient beings who should be replaced? I think that fear comes from a guilt shared by a lot of folks, even if it never reaches the surface.
The other side is this: suppose an AI can think for itself and thinks better than humans do. On its own initiative, it decides that humans are stupid and wasteful, but that there is still room to teach and nurture them.
In either case, I think that says less about AIs and more about human nature and how we feel about ourselves, don't you think?