> I think rather than concluding that we must be more than physical, a
> more reasonable conclusion is that there's something that we don't
> yet understand about the physical world, i.e. how matter in the
> physical world can give rise to subjective experience.
What you're fumbling towards here is called the "hard problem of
consciousness"[1]. Essentially some believe that there's some secret
sauce in the brain that gives rise to consciousness, while others
believe that there's no such "hard" problem and that "consciousness"
is just what we call a bunch of simpler brain systems working in
unison.
My opinion, as a layman who's read quite a bit about this, is that
there's no "hard" problem, and that the only thing separating us from
"lower" animals is that we're running a slightly more sophisticated
version of the same physical brain processes, which we like to call
consciousness.
There's no reason to believe that we're anything other than the
cumulative behavior of our brain matter, or that we're somehow
"special" in the animal kingdom in any sense other than e.g. wolves
being more sophisticated than mice.
I agree that we're not special, but the hard problem is explaining why subjective experience exists at all. What are the arrangements of physical matter that give rise to it?
Here's another fun thought. Presumably if we built a human by artificially placing all the atoms in the right place, then they would have subjective experience. What if instead we simulated that on a computer? The simulated human would act the same as a real one, including expressing wonder about how it is that they have subjective experience.
Would the simulated human have real subjective experience? No way to find out as far as I can tell. You can't even find out whether I have true subjective experience. Maybe I'm just a machine acting as if I do. So maybe it's a nonsensical question, but I have never found a satisfying explanation for why exactly it's a nonsensical question.
> Would the simulated human have real subjective experience? No
> way to find out as far as I can tell.
I think that once you hold the view that living beings aren't "magic",
that our behavior / wants / needs are just a result of the sum of our
hardware as it were, you'll quickly realize that talking about "real"
experiences doesn't make sense.
Extrapolating from a simpler system can help to give clarity,
e.g. does a Lego robot that's running a complete simulation of a
round worm's neural network feel "real" hunger when its food sensory
neurons are stimulated?
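To make the thought experiment concrete, here's a deliberately toy sketch (hypothetical wiring, nothing like the worm's actual connectome): a food-sensing neuron drives downstream neurons, and "hunger" is nothing over and above numbers flowing through that structure.

```python
def step(stimulus, weights, threshold=0.5):
    """Propagate a sensory stimulus through one layer of neurons.

    stimulus: activation of the food sensory neuron (0.0 to 1.0)
    weights: connection strengths to each downstream neuron
    Returns which downstream neurons fired.
    """
    return [stimulus * w > threshold for w in weights]

# Hypothetical wiring: sensor -> [interneuron, motor neuron]
weights = [0.9, 0.7]

print(step(0.0, weights))  # no food detected: [False, False]
print(step(1.0, weights))  # food detected, "hunger" fires: [True, True]
```

Whether this toy runs on a Lego robot, a laptop, or biological cells, the computation is identical, which is the intuition the argument below leans on.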
Some might have the knee-jerk reaction of saying something to the effect of "of course it's not real, it's just a simulation running on some robot, it's not actually hungry".
I think that falls apart when you consider the following: suppose I had the technology to take an existing round worm and slowly replace its individual cells & components with artificial equivalents. At what point would it stop having "real" feelings of hunger, or anything else?
I think the only sensible answer is to realize that the question doesn't make any sense. No more than asserting that moving a computer program from one computer to another makes its execution any less "real".
> You can't even find out whether I have true subjective
> experience. Maybe I'm just a machine acting as if I do. So maybe
> it's a nonsensical question, but I have never found a satisfying
> explanation for why exactly it's a nonsensical question.
I think if we mapped your entire biology & neural network and uploaded
it to the proverbial Lego robot, we could easily find out a lot
about what's happening with your experience of the world.
E.g. maybe you're in chronic pain, maybe you're hungry. Those things
are just the result of biological processes in your body &
brain. There's no reason to think subjective experience is anything
inherently different.
Science has answered questions that we previously thought couldn't be answered many times, such as what the age of the universe is. Maybe the question of consciousness is different, but I wouldn't be surprised if science had some interesting things to say about it in the future.
1. https://en.wikipedia.org/wiki/Hard_problem_of_consciousness