
Yeah, that's true, but it still doesn't really explain a hallucination. It's like seeing an infrared video of a complex machine like a car. OK, so the main heat seems to happen in the front part. Does that explain it?

As one who has hallucinated intensely, I can say the experience is not just about what your senses create for you while certain neural structures are excited. Oftentimes it's about what is left when your senses have gone off the deep end. For many it's a deeply spiritual experience; some experience a total ego death. Others experience lifetimes in different bodies, working jobs and having families that never existed here.

So while it's interesting to know something about what the neurons are doing, I don't think that gets us any closer to an explanation.




But none of the things you cite are intrinsically outside of neural structures. To be clear, explaining a hallucination doesn't require explaining consciousness. Consciousness is a background assumption. But given consciousness, hallucinations are merely abnormal patterns of neural excitation, some of which influence conscious experience. What is "left over" after a psychedelic experience could be due to new memories that give one a fresh interpretation of typical experiences, or to new connections, made in the brain during the overexcited state, that induce new patterns of thinking.


But you don't actually _know_ that none of the things I've cited are intrinsically outside of neural structures. Nobody actually _knows_, except the very religious. It might be a reasonable assumption, but at that level I'm not sure it's a simpler assumption than that there are things about the universe we don't understand, which many naively file under "extra dimensions".

If you want to compare two explanations for the experiences of someone who is hallucinating, where one invokes extra "dimensions" and the other amounts to "hallucinations mean you hallucinate", and you want to say one is obviously simpler than the other, then from a certain perspective you might be right, but from where I sit you're leaving a lot of potential discussion on the table.


> But you don't actually _know_ ...

An interesting observation I've had is that there seems to be something about the nature of human consciousness such that people are ~literally unable to fully grasp the idea that they often/usually don't actually(!) know what they think they know with high accuracy. Some people, depending (it seems) on the topic of conversation, are sometimes able to switch to an abstract mode of thinking and realize and admit that yes, of course, they do not really know with 100% certainty that "<X> is True"... but often only if this abstract notion is pointed out to them by a third party. Upon resuming the object-level discussion, the knowledge that existed mere minutes ago often seems to become inaccessible once again. And some people seem unable to accomplish (or at least admit) this at all and, even more curiously, seem strongly motivated to resist even discussing the idea that they may have made an error.

On one hand, you might just write this off as people "being people" who want to "win an argument" and that sort of thing, and surely that's a big part of it, but is that all there is to it? As a terrible analogy, consider how difficult it is to, say, recite song lyrics while doing mildly complex math in your head. Considering this, is it so hard to imagine that the mind may also be sub-optimal to an unknown degree when it comes to reckoning about the complex reality we live in at the object level (physical reality and events), while concurrently executing a "proper" abstract background process to do things like evaluate logical consistency and epistemic soundness of the primary object level processing, particularly on sensitive topics?

Just pondering the general notion, I lean strongly towards the intuition that I'd be surprised if we could do this in a skillful and accurate manner, rather than being surprised that we cannot (which seems to be overwhelmingly the default opinion), and observations of internet discussions (regardless of community) tend to strongly support this theory as far as I can tell. Might this help explain how so many people believe so many diametrically opposed things (increasingly, as the complexity of the world increases), while also having an extremely strong self-perception of objective correctness, even though objective correctness is often literally impossible for the topic being discussed?

How this relates back to the original topic: a lot of people perceive a dramatic increase in the ability to think (in more ways than one) deeply about extremely complex topics while under the influence of psychedelic drugs, and fMRI studies are now starting to illustrate changes at the neurological level that may plausibly explain, at least in part, why this is. I think it's quite philosophically interesting to consider what the real-world consequences might be if our perception of reality is not 100% consistent with actual reality. Might this result in sub-optimal decision making at both the individual and societal/national levels from time to time? And what if it's off not just by a little bit here and there, but by a lot, all over the place? If so, might this perhaps help explain the counter-intuitive human behavior and general state of world affairs that I've been reading about on the internet lately?


> is it so hard to imagine that the mind may also be sub-optimal to an unknown degree when it comes to reckoning about the complex reality we live in at the object level (physical reality and events), while concurrently executing a "proper" abstract background process to do things like evaluate logical consistency and epistemic soundness of the primary object level processing, particularly on sensitive topics?

Agreed. To add to the point, one can't help but consider, given what we know of the evolution of the tree of life of which we are a part, the particular factors that do and do not lend themselves to survival. Namely, "reckoning about the complex reality we live in at the object level" is probably not a skill needed for survival in the way that developing hunting, language, and social skills probably were, not to mention "while concurrently executing a 'proper' abstract background process to do things like evaluate logical consistency and epistemic soundness of the primary object level processing".

In my own experience, had my time in a severely altered state of consciousness persisted for more than a day or two, I would have needed a lot of assistance with basic survival tasks. In that state I perceived the world differently, if not more objectively, but I was not well equipped for survival.

So, no, I see no reason to think the mind is well equipped for that kind of reasoning when evolution might select against that kind of adaptation.


Sounds fair... Here's a German comic about what may become common, if 'training' is what you were referring to, with 'real-world consequences':

> //www.bildhost.com/images/2020/09/16/909.1_ABGEDUNKELTER-RAUM-KAMERAS...mar.15_FINAL.Mail.png ^^



