
Your comment reminded me of a clever and well-written short story called "Understand" by Ted Chiang.

> We could fuzz it and see if we can crash a brain.

Sadly, we already know how to do this. Torture, fear, depression, regret: we have a wide selection to choose from if we want to "crash a brain".




I don't mean it quite like that.

Think for instance of a song that got stuck in your head. It probably hits some parts of your brain just right. What if we could fine-tune that? What if we take a brain simulator and a synthesizer, and write a GA that keeps trying to create a sound that hits some maximum response?

It's possible that we could make a sound that would get stuck in your head, or tune it until it's almost a drug in musical form.
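The loop described above can be sketched out. Everything here is hypothetical: there is no real brain simulator, so `catchiness` is a made-up stand-in scoring function with an arbitrary peak, and the genome is just three toy synth parameters (frequency, tempo, decay). The GA structure itself — score, keep the fittest quarter, refill with mutated copies — is the standard pattern:

```python
import random

random.seed(0)  # deterministic run for illustration

# Hypothetical stand-in for the brain simulator's response. A real system
# would synthesize audio from the genome and score the simulated brain's
# reaction; here we just penalize distance from an arbitrary "sweet spot".
def catchiness(genome):
    freq, tempo, decay = genome
    return -((freq - 440.0) ** 2 / 1e4
             + (tempo - 120.0) ** 2 / 1e2
             + (decay - 0.5) ** 2)

def mutate(genome, scale=1.0):
    # Small Gaussian perturbation of every synth parameter.
    return [g + random.gauss(0, scale) for g in genome]

def evolve(pop_size=50, generations=200):
    # Random initial population: [frequency Hz, tempo BPM, decay 0..1].
    population = [[random.uniform(100, 1000),
                   random.uniform(60, 200),
                   random.uniform(0.0, 1.0)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the top quarter, refill with mutated survivors.
        population.sort(key=catchiness, reverse=True)
        survivors = population[: pop_size // 4]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return max(population, key=catchiness)

best = evolve()
```

The hard part, of course, is not the GA; it's that `catchiness` would have to be a faithful brain simulator for the result to transfer to a real listener.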


What you're talking about is getting pretty close to a Basilisk - https://en.wikipedia.org/wiki/David_Langford#Basilisks


BLIT is available online (http://www.infinityplus.co.uk/stories/blit.htm); it's a fun short read.


"fun"...


I don't recall what basilisks are, only that I burned it from memory for probably a good reason.


I've no experience with it, but I imagine it's like heroin or DMT or something like that. Wouldn't that come close to something that "hits some maximum"?


Brains still operate as brains after severe trauma. They just don't necessarily operate well as humans in a society. Though I guess you could say making a brain destroy itself (suicide) is "crashing it" too.


> Brains still operate as brains after severe trauma

Well, except when they don't. And since a brain functioning as a brain is part of the operating requirements for the body that keeps the brain running at all, when they don't, they ultimately fail entirely in short order.

So, assuming that a brain generally operates as a brain after severe trauma is a pretty serious case of survivorship bias.


Brains do have plenty of "backup systems". Emotions and stress do affect the whole body, but spinal reflexes are not affected as much, and you will likely not fall over just because of extreme stress. There are similarly many more "primitive" systems in place; for instance, you will continue to take breaths even at "shutdown".


My first thought was that this reminded me of an epileptic seizure brought on by "fuzzing" (sensory overload).


I think that's pretty plausible.


Ted Chiang’s “The Lifecycle of Software Objects” is also similar to the OP. It's basically about how an AI (not strictly an upload) would probably be subjected to all sorts of horrible treatment if it were widely available.



