
> Why can't I ever wake up in someone else's consciousness? How does "my" consciousness know to come back into "my" brain whenever I lose it (through sleep or injury, etc).

Why doesn't the fire in your car's engine suddenly appear in the engine of a car down the street?




This is why I say I'm not able to communicate my thought very well. Your statement is simply stating the obvious; it's not what I'm trying to get at.

It's more like... why did my consciousness decide to attach itself specifically to "my" body, and only my body? Is it REALLY only attached to my body? Each of us is simply a collection of atoms - as Carl Sagan says, we are the universe trying to understand itself.

I have consciousness AND I'm merely made up of atoms, while every other person is also merely made up of atoms. Does that mean my consciousness could have randomly ended up inside any other "being"? And would that consciousness be the "same" me, just in a different body? OR would "my" consciousness be affected and come out differently if it were filtered by a different being? This is kind of what I'm trying to get at.

Someone else on this thread said something along the lines of thinking of consciousness as a field that permeates everything. This is along the lines of what I'm thinking. Consciousness is the same; we are all the same consciousness, simply filtered through different modules.


Pretty much this. Your "consciousness" is nothing but neural activity in your own brain. By definition, it cannot appear anywhere else but your brain.


That is speculation.

We have absolutely no idea how consciousness is produced by electrical and chemical signalling within a large ball of fat.

We know that it is (because we perceive our own), but we can't map from one to the other.

If consciousness is merely neural activity in a brain, then why can't we simulate really simple brains?


Because really simple brains don't have consciousness; they aren't powerful enough to have the "mental machinery" needed for consciousness.

We know that brains can implement things like theory of mind, self-awareness and abstract concepts; these particular things don't require a human brain, but they appear only in animal brains above some level of complexity, and not in brains so simple that we'd have the computing power to simulate them in 2020.


We don't know that. How would we ever observe such a thing?

Like the core problem of consciousness is that we experience it, but have no way of identifying it otherwise.


Reductionism is by definition less than a full perspective on life, humanism and depth of possibilities.



