As my six-year-old son watched one of the videos he asked, "Dad, when will the robots be able to stop working for them and be free?" To say I was blown away is an understatement... that was pure human emotion and honesty, as he's never seen or read any sci-fi with such questions presented to him.
Based on what? We evolved from chemicals to single-celled organisms, to multi-celled organisms, to worms, to fish, to mammals, to apes, et cetera. At which point in this process did we become more than our physical body? I don't see any reason why an organism that came to be through evolution is fundamentally different from a machine that we build. In both cases it's the laws of nature at work within the universe. I think rather than concluding that we must be more than physical, a more reasonable conclusion is that there's something we don't yet understand about the physical world, i.e. how matter in the physical world can give rise to subjective experience.
> I think rather than concluding that we must be more than physical, a
> more reasonable conclusion is that there's something that we don't
> yet understand about the physical world, i.e. how matter in the
> physical world can give rise to subjective experience.
What you're fumbling towards here is called the "hard problem of
consciousness"[1]. Essentially some believe that there's some secret
sauce in the brain that gives rise to consciousness, while others
believe that there's no such "hard" problem and that "consciousness"
is just what we call a bunch of simpler brain systems working in
unison.
My opinion, as a layman who's read quite a bit about this, is that
there's no "hard" problem, and the only thing that separates us from
"lower" animals is just that we're running a slightly more
sophisticated version of the same physical brain processes, which we
like to call consciousness.
There's no reason to believe that we're anything more than the
cumulative behavior of our brain matter, or that we're somehow
"special" in the animal kingdom in any deeper sense than, e.g., wolves
being more special & sophisticated than mice.
I agree that we're not special, but the hard problem is explaining why subjective experience exists at all. What are the arrangements of physical matter that give rise to it?

Here's another fun thought. Presumably if we built a human by artificially placing all the atoms in the right place, then they would have subjective experience. What if instead we simulated that on a computer? The simulated human would act the same as a real one, including expressing wonder about how it is that they have subjective experience. Would the simulated human have real subjective experience? No way to find out as far as I can tell. You can't even find out whether I have true subjective experience. Maybe I'm just a machine acting as if I do. So maybe it's a nonsensical question, but I have never found a satisfying explanation for why exactly it's a nonsensical question.
> Would the simulated human have real subjective experience? No
> way to find out as far as I can tell.
I think that once you hold the view that living beings aren't "magic",
that our behavior / wants / needs are just a result of the sum of our
hardware as it were, you'll quickly realize that talking about "real"
experiences doesn't make sense.
Extrapolating from a simpler system can help to give clarity,
e.g. does this Lego robot that's running a complete simulation of a
roundworm's neural network feel "real" hunger when its food sensory
neurons are stimulated?
Some might have the knee-jerk reaction of saying something to the effect of "of course it's not real, it's just a simulation running on some robot, it's not actually hungry".
I think that falls apart when you consider the following: suppose I had the technology to take an existing roundworm and slowly replace its individual cells & components with artificial equivalents. At what point would it stop having "real" feelings of hunger or anything else?
I think the only sensible answer is to realize that the question doesn't make any sense. No more than asserting that moving a computer program from one computer to another makes its execution any less "real".
> You can't even find out whether I have true subjective
> experience. Maybe I'm just a machine acting as if I do. So maybe
> it's a nonsensical question, but I have never found a satisfying
> explanation for why exactly it's a nonsensical question.
I think if we mapped your entire biology & neural network and uploaded
it to the proverbial Lego Robot we could easily find out a lot of
things about what's happening with your experience of the
world.
E.g. maybe you're in chronic pain, maybe you're hungry. Those things
are just the result of biological processes in your body &
brain. There's no reason to think subjective experience is anything
inherently different.
Science has answered questions that we previously thought couldn't be answered many times, such as the age of the universe. Maybe the question of consciousness is different, but I wouldn't be surprised if science had some interesting things to say about it in the future.
You may not have heard about this, but extremely smart people have been arguing about this problem for thousands of years. I guarantee you it won't be solved in this thread.
It may be the case that every process is purely chemical and physical, and it most probably is, but that is not a useful way to understand the world. Even if merely conceptual, a metaphysical side to our essence does exist. What's transcendental might indeed be an elusive physical phenomenon, but we're sort of trapped inside it: even if we one day come to understand it (the subjective experience, that is), our understanding will happen within our subjective experience, so why try to deny it? I think it's better to embrace it. Karen Armstrong has produced some very interesting reading on the topic [1], though I do not agree with her on each and every point she makes.
BTW, when I say metaphysical and transcendental, I do not refer to religion and divine existence. I am somewhat close to Sartre, though not completely; I'm adding this just to clarify my stance a bit more.
I love to play with words. They often convey meanings. I'm French, and "death" translates into "la mort". When you say "la mort", you hear "l'âme hors" (the soul out), which tells you we are a soul in a body. And that's what almost all religions teach: we come from the spiritual world and we are here to experience the physical world, and not the other way around.
But yes, it's a huge philosophical/metaphysical/religious debate. We can't really address this topic here.
Whether or not humans are "more than" machines is a vast subject of debate.
What is interesting right here, however, is the fact that watching this video many people do feel some sense of empathy for the robot.
The editing of the video deliberately and perhaps inexpertly(1) heightens that, but I think we'll be surprised (as a society) at how we "feel" when robots really start walking the Earth.
(1) The recent film "Ex Machina" explores this topic at some depth, and very expertly.
Sounds to me like you're saying the physical machine gives rise to something else beyond the machine. In which case, why waste energy insisting we're only machine?
Because there is a difference between being more than physical, and the physical giving rise to the subjective experience. Being more than physical means that we are physical plus some special sauce. This would mean that a robot that we build probably lacks said special sauce, because we didn't put the special sauce in. If on the other hand it's the physical that gives rise to the subjective experience, then there is no reason to think that robots couldn't have it.
Sure, that's fine. We could then say, however, the robot is more than a machine once it has this new special feature. A special feature that may have arisen from a specific arrangement of the robot's material makeup.
Either way, "more than machine" will help these new love-enabled robots have "human rights" when they eventually do need to stand up for their rights, and wish to do so beyond their programmed directive.
Sometimes when I'm stuck on the puzzles, I leave the game for a few days, then come back and solve it within 5 minutes. I've sometimes done the same thing with JavaScript bugs.
There is a similar phenomenon at work with The Witness too. I've heard it attributed to the fact that the brain continues expending energy on unsolved problems even subconsciously. Not sure if that's true or not.
It couldn't possibly take over if vengeance against guys with hockey sticks was a primary utility-function-maximizer. Perhaps in Canada, but it would be easy to defeat, having never trained against artillery and whatnot.
https://www.youtube.com/watch?v=G_P-zl8QKp0