
> alcohol

Alcohol has an effect on the brain, which in turn has effects on consciousness. It might seem like a pedantic difference, but we don't really have a solid grasp on how the brain gives rise to consciousness, so I would imagine that investigating alcohol > consciousness involves so many confounding variables that we might not benefit from it.

Of course, I'm certainly not under the impression that my college psych 101 class makes me any sort of expert, and I might be far too pessimistic here.




Seeing as you took a psychology 101 class, that makes you more of an expert than me :p

I'm more of a consciousness enthusiast.

I'm less interested in the question of "can computers be conscious?", and more interested in "what is consciousness?". I believe the first question leads to a dead end, because even /if/ you built a conscious computer, you still wouldn't have a way to prove it.

Experiment with your own consciousness; it's the only one that's provably observable to you. Based on this assumed truth, you can ask others to do the same.

Following the alternative path of experimenting with our own consciousness, I come up with questions like these. They're very fun to think about, but I don't know what I can do with them. I'm no neuroscientist, or even a scientist at all. Are any of them actionable?

What would it take to add another qualia space to our everyday experience? (By qualia spaces I mean the range of qualia linked to a given kind of sensory experience; all possible observed images fall into one qualia space.) Would it take another sensor attached to our body? Adaptations to our brain? Are the ones we have (vision, sound, smell, touch, internal feelings, emotions, heat, thought) all the qualia spaces there are?

Imagine a volume slider for your hearing. You know that a louder sound in the outside world means a louder experience of sound in your mind. What is the mind's volume limit? Is there even a volume limit? Can we reach it by producing sound waves? Assuming the limit is far higher than anything we have experienced in our everyday lives, what would it do to a person to hear a sound that much louder?

So much to think about!


> I'm less interested in the question of "can computers be conscious?", and more interested in "what is consciousness?". I believe the first question leads to a dead end, because even /if/ you built a conscious computer, you still wouldn't have a way to prove it.

I agree with you, but I don't think the first question is a dead end... suppose you devise and install a computer chip that can replace a tiny portion of your brain by interfacing with neurons around it, and over time it learns to perform the same function as the piece of brain tissue it replaced. If you think this is physically possible (even if not feasible with current technology), then by extension, over a long enough period of time you could replace every bit of biological nervous tissue with synthetic components. Assuming the synthetic components were able to wholly serve the same functions as the biological ones that they replaced, you would have a computer which was conscious, and you would have an inside perspective on it. :)

On the other hand, in theory it might be possible to do that complete nervous system replacement without actually understanding what consciousness is. So I agree that "what is consciousness?" is still a more interesting question. I found The Ego Tunnel[0] to be a good jumping off point for defining consciousness.

At this point I think the problem of consciousness is one of scope: any description of consciousness that fits into the mental capacity allocated to our intuition will lack explanations for aspects of our conscious experience that we can easily identify with a few moments of introspection, and any more comprehensive description will exceed our ability to intuitively grasp as a whole. I don't think this is a fundamental limitation; I think it's just a reflection of how much base knowledge is necessary to build up adequate intuitions, given the complexity of the system.

[0]: https://www.amazon.com/dp/B0097DHVGW/ref=dp-kindle-redirect?...


I've never found that argument convincing.

It seems to suppose that consciousness is an either/or phenomenon: either you're conscious or you're not. But what if consciousness is more of a continuum, and the intensity of your conscious experiences can increase or decrease? Further, let's suppose that the source of consciousness is somehow distributed throughout the brain. When the first small set of circuits is replaced, your conscious experiences become ever so slightly less intense, by such a small amount that you don't even notice. This continues every time another small brain region is replaced, until at the end you no longer have any conscious experiences at all.


Yeah, but that's still considering only the epiphenomenon hypothesis and not other possibilities. What if the only thing you can explain with the epiphenomenon idea is the philosophical zombie (mind), but you still haven't explained anything about consciousness?

Ref: https://www.youtube.com/watch?v=NK1Yo6VbRoo


My intent was actually not to take any firm position on the nature of consciousness, but rather to present one possible model, which I think many would find plausible, as a counterexample to the progressive replacement argument (i.e. the argument that a machine can be conscious because you can gradually turn a human brain into a machine).



