“You want to know something about how bullshit insane our brains are?” (2018) (twitter.com/foone)
476 points by notRobot on June 21, 2020 | 200 comments



So my visual system takes an image and projects it back through time to when I started moving my eyes, to present a consistent image of the world? Or, alternatively, since time travel is impossible, I must be perceiving the past: I am living slightly in the past, and my visual system fills in the holes to make it seem like I'm always perceiving the present.

Why this is presented as bullshit insane I have no idea. It seems like an amazingly brilliant solution to me, much better than the 'engineering' solution suggested where your vision flashes black every time you move your eyes. Also worth mentioning that this weirdness is repeated in human consciousness all the time. Like the theory (tested as far as we can) that tall people perceive the world further in the past so they can reconcile nerve impulses from their feet as happening at the same time as visual stimulus. We've known about phenomena like this for a long time, as well as other deeper revelations like, what is sometimes called, the illusion of free will. And yet we constantly act surprised that our conscious experience of the world doesn't correlate with what we find out is 'actually' happening.

So we are flawed meat bags with insane, nonsensical brain mechanics; if only our biological solutions were as well thought out as VR, eh?

How glad am I that the incredible complexity and subtlety of evolution created my visual system and not Valve or Sony.


This is something the philosopher Daniel Dennett has written about at length. The nature of our consciousness is not as it appears. (Here's his TED talk [0].)

As you say, blinking rarely makes its way into conscious thought, despite the visual input being briefly suspended. (I suppose a similar thing happens with smell. How often do you notice that you don't smell much when you breathe out?) Neither do we notice that our various senses have different latencies, reporting the same event at different times.

As I understand it, the blind spot in our vision was only discovered relatively recently. For countless millennia, mankind wasn't even aware of it, and not for lack of curiosity. I suspect the same goes for our colour perception.

Senses can even blur together - the input from, say, your sight, can affect how you perceive taste, or texture.

As you say, this kind of 'smearing' is a brilliant feature of our minds; it's not an ugly hack at all. That we aren't able to intuit the precise nature of how we perceive things is a small price to pay, both in evolutionary terms and in terms of the elegance of the system, to my mind.

On a less substantive note: reading lengthy writeups on Twitter is singularly painful. I do wish people would make the leap to the blog-post format for this kind of thing.

[0] https://www.youtube.com/watch?v=fjbWr3ODbAo


> As you say, blinking rarely makes its way into conscious thought, despite the visual input being briefly suspended. (I suppose a similar thing happens with smell. How often do you notice that you don't smell much when you breathe out?) Neither do we notice that our various senses have different latencies, reporting the same event at different times.

Interesting anecdote: I've been through a fair amount of training in perception and conscious proprioception in the military. The end result of all of this training is that I am very often actively paying attention to my body, especially my vision, hearing, and touch. I've noticed that when I blink, I'm consciously aware of what has changed when I finish blinking. I'm aware of how my body moves during this time, and of whether my sense of touch is telling me something is different than before my eyes closed. I can pick things up that I can't see without fumbling for them. It's really interesting how effectively we can mentally model the world around us and incorporate different types of senses to update this global model.

I imagine that professional athletes, race car drivers, martial artists, etc. develop similar levels of proprioception and modeling.


Certainly to a lesser degree than it sounds like you’ve developed in the military, but just riding a motorcycle regularly gets you to a similar place. In order to survive on a bike, you need to stop assuming that other people see you. That leads to a much greater awareness of what’s happening on the road and what’s coming up in the near future, and the motion of your own body in space.


@Foone has ADHD and finds it impossible to write blog posts:

https://twitter.com/Foone/status/1161786083782250496


You mentioned color perception and that it was probably only understood relatively recently. This makes me wonder whether philosophers with unique perceptual differences might come to different conclusions than their neurotypical counterparts.

For example, would a color blind philosopher understand the world differently? How about a synaesthete? What about a philosopher diagnosed with sociopathy? Would their experience change their rational understanding of the world?


Thanks for writing this. While the tweets have great comedic effect, they are just really stupid.

You could very easily write a whole piece about how elegant and awe inspiringly complex and effective they are on the exact same subject.

Except it wouldn’t get as many likes and retweets, of course.

Sort of made me sad just thinking about that. Twitter really does encourage the worst forms of conversation - outrage, sarcasm, cheapness, and drunk humor dominate.


I didn't take it like that at all and I feel you are over-reacting. To me it seems that the author (in his own tongue-in-cheek way) is awed by these clever solutions that natural selection has found.


I feel like these kinds of "X scientific phenomenon is incredible magical bullshit and here's why!" type deals are overdone and feel kinda forced.

But eh, I'm not the one who took their time to write this all out in a presentable way, so whatever the author wants.


I don't understand your criticism. Foone (a rather famous user of Twitter with 62k followers who posts enormously interesting content in my opinion) is successfully engaging his audience in a casual style that they appreciate. How is this "really stupid"? Because it doesn't live up to your expectations for the form that such content should take?


Kardashians have many more followers and are successfully engaging their audiences. Their posts must be really smart.


They would get the same audience if they were an average-looking woman and decided to do porn for free.

Nassim Haramein - a British leftist new-ager who claims to have invented a new theory of physics implying all particles are actually black holes, and who's quite popular in North London hippie circles - has 350K followers on Facebook.

I think these are good baselines of what it takes to get 62k followers.


Here's a link to the "quack"'s paper: https://www.researchgate.net/publication/302941651_Quantum_G...

Science establishment knows all, and is never wrong, especially once it excommunicates a heretic. Science heretics are always bad and wrong. Science establishment people are always right. Trust science establishment. Give science establishment money. Science good. Quack bad. Give money. No money to quack. All to science. Science good.

Did we mention science right and quack wrong?


> Sort of made me sad just thinking about that. Twitter really does encourage the worst forms of conversation - outrage, sarcasm, cheapness, and drunk humor dominate.

I've never understood why, in the Eternal September, anyone would expect the average late-night conversation on the internet to be any different from the average late-night conversations at the local pub. Sure, you might find that rare table of PhD students/young engineers/residents/academics and teachers/etc. having a genuinely interesting conversation as they transition from work to relaxation after a long night at the lab/hospital/grading. But mostly it's drunk idiots relishing base pleasures. And after the second drink or so those engaging conversations will devolve.

I don't think Twitter is encouraging anything in particular here. The same thing happened with IRC channels, forums, etc. No point in getting angry or sad about the Eternal September; it was always inevitable.


You don't think the nature of a tweet, the very limitation placed on the number of characters, encourages a certain type of communication/interaction?

The platform not only discourages long form dialogue, which is especially useful for communicating complex topics, it explicitly prohibits it.


> You don't think the nature of a tweet, the very limitation placed on the number of characters, encourages a certain type of communication/interaction?

Sure, but I don't think that type of communication/interaction causes bad conversations. The conversation was going to be a shit show regardless. Go join any Moms' or political group on Facebook. The conversation isn't exactly elevated.

> The platform not only discourages long form dialogue, which is especially useful for communicating complex topics, it explicitly prohibits it.

Not really. "1/N"


The pub analogy or even the Facebook group isn't very good.

Tweeting involves getting your short thoughts evaluated and rated by strangers at scale! You're also regularly aware of the properties required to (potentially) go viral.

It is far more difficult, by design, to encourage better conversations there.


Right, and I think most conversations in real-time (like in a pub) involve a decent amount of back-and-forth; most don't consist of people trading long blog-post-length monologues. (Though I suppose there's a certain sort of person who pontificates while drunk.)

While I'm not a big fan of Twitter, "the character limit encourages a weird conversational style" is not on my list of grievances at all.


A few years ago I used to have great conversations on Twitter, but now there are too many people, and many do it full time like a job or something, conditioned by the reward mechanisms to get those Likes and build social media influence. That changes what kind of conversations are possible.

I see Twitter as mostly a broadcast medium where you go to get the latest info on some event.

Conversations of the kind that feel natural and human don't have a weird, distracting real-time ticker above my head scoring every dumb thing that flits through my mind.


"many do it full time like a job or something, conditioned by the reward mechanisms to get those Likes and build social media influence."

Without going into the backstory, there was a period of time where I had to follow about 100 very specific people. I also happen to know how successful these people were financially.

Those who were the most successful financially tweeted the least, and those who were the least successful financially tweeted the most. Then there were those who didn't tweet at all because they had no financial incentive to and their egos weren't contingent upon it.


These things can be a great entry point to dispel naive unexamined homunculus ideas and to dive into some philosophy and neuroscience.


Sure, may have been a bit overly pessimistic. But tbh you could still be super funny and not so “omg we're dumb”.

The 2010s “lol look at this dumb shit” attitude reminds me of the 90s “edgy ironic” attitude. It’s lazy, just following the trend that further reduces our ability to continue and have a cool discussion. You can be funny without needing to hit every dopamine receptor. But hey, maybe that’s just what Twitter is for - letting off some steam.

I only felt sad because I saw discussions here the other day about humans being spaghetti code, and this sort of absolutist/negative/ironic postmodern attitude just kind of sucks - it actually shuts down conversation (note you are literally just defending it instead of expanding into, say, some philosophy). You can be totally tongue in cheek, funny, and even self-deprecating without being just wrong and super absolutist at once; it's just not as easy and probably wins a few fewer internet points.

If anything I’m advocating for a humor that complements truth.


The spaghetti code stuff may point to the right direction though, depending on one's current opinion/understanding. As I always remark, writing a comment or blog post can only point in a direction, as in "please shift your opinion following this arrow", but not a destination as in "please arrive at this target".

If one watches a lot of brain documentaries on pop-sci TV channels, the thing they present looks like a magnificent, beautiful 3D model with bluish and pinkish stuff, where light pulses go around as in some hyperfuturistic awesome thing while the camera circles around to beautiful, intriguing music in the background. The reality is a squishy gobbledygook of kludges that gets its job done remarkably well, but through some remarkably strange ways. It is a lot of spaghetti: no clear modularity, lots of stuff performing multiple functions and doing things in a roundabout way.

I first read about this contrast in David Linden's really enjoyable "The Accidental Mind: How Brain Evolution Has Given Us Love, Memory, Dreams, and God".

But if you already think that everything is rotten and humans are crap and biased beyond imagination, nothing has any value etc. etc. then perhaps you'd benefit from the opposite direction idea, about how really well it works and how much extra stuff the mind does that is not strictly mundane and everyday tasks.


I think you probably should be pointed in the direction of your last statement, but hey, that’s just my direction.


It reads like an article written entirely with Buzzfeed headlines.


It's two sides of the same coin. Anything heavily-optimized for something other than simplicity ends up maddeningly complex. It doesn't make me sad that people prefer the more entertaining perspective on the same information.


I wondered after a few posts why it was even on Twitter, and if it had been an article/blog post there would definitely have been less motivation for the author to play to the gallery.

> how bullshit insane our brains are

> but OH NO

> this shit works

> your freaking visual system just lied to you about HOW LONG TIME IS

> we're apparently computers programmed by batshit insane drunkards in Visual Basic 5

> your brain has EVEN MORE UGLY HACKS

Interminable shtick that just makes subject matter pointlessly longer and more of a chore to read.


I'm ambivalent; on the one hand this could be a long-form article, but on the other I prefer a ten-tweet summary to a New Yorker article where they'll start by discussing a neuroscientist's dog walk for 500 words before getting to the topic.


Foone (the author) finds it impossible to write blog posts. So, they write Twitter threads instead in order to get something written. They’ve explained this in more detail here: https://twitter.com/Foone/status/1066547670477488128


This comes up any time @foone’s content gets posted.

TL;DR: that's the author's choice, both for stylistic and personal reasons. When the choice is between “don't blog at all” and “post on twitter” because that's what fits his mind process, I would rather read about it on twitter than not at all.

Judging by the amount of discussion here, it seems like this was thought provoking for a number of people despite the format.

I find twitter threads frustrating to follow also, but you could always... skip clicking the link if you feel so strongly about it.

As an aside, criticizing the form of content rather than the substance is not of particular interest in general, to be honest. It comes off as “I didn’t really have anything to say about the subject, but at least I can comment on perceived flaws in the style of presentation”.


The critique is actually that this stupid YouTuber attention-desperate style leads to Reddit level discussion. Especially because he just frames the whole process as being dumb rather than... interesting. So no it’s not just a style critique, he’s just making wild unfounded claims while trying to frame them as obvious truths.

If you like that style, then fine, but why don't you just not reply to this valid sub-thread? I don't see you adding anything to the discussion. But the discussion of the degradation of our ability to communicate due to social media incentivizing stupid clickbait is super important.


Apropos of nothing, thanks for your work on Recoil, it’s awesome :)

I don’t think the discussion on this post here is reddit-level - for example, I found the top level comment on the role of dreams, and the link to the commenter’s paper on psyarxiv super interesting. As I said in my comment, I too find the twitter thread format a bit frustrating to read.

What I was trying to say is that the author (Foone) has previously posted about this - for them, the choice really is between “blog this way or not at all” due to the way their mind works.

Of course the fact that this topic comes up every time is indicative, but I think what it indicates is that Twitter UX is terrible, not that the author is a bad person for choosing to share in this format.


Thanks! I haven’t done much at all, not sure if you’re involved with it but if so thanks right back, I do love Recoil and its potential.

I definitely should aim my criticism at the incentives more than the people, that’s a good point. I specifically like HN because it seems to have figured out pretty nice incentives (of course the lack of scale is the key).


I'd take a twitter thread over medium any day anyway.


I didn't say anything even remotely like they should not blog at all. Someone posted their opinion, some people disagreed with it - and escaped your criticism of "I don't have anything to say about the subject" - and I agreed with it.

If this happens any time this author is posted there's obviously something to the criticism of the style.

As for your aside, "if you don't like it, don't read" is also not a particularly interesting contribution, to be honest.


I didn’t mean to imply that you said they should not blog at all. I should have been clearer - this is something the author themself said in the past (linked in a sibling reply). For them, due to the way their mind works, that is the choice - they have tried long-form blogging and found themselves incapable of ever completing a post. I can relate to that experience in some ways.

I am not saying Twitter is a great platform for this type of thing - clearly it’s not.


Fair enough, and thanks. If I'd known the criticism was old news I wouldn't have added to it, or at worst done so ironically. I just saw someone posting similar thoughts to my own and basically let them know they weren't alone.

I now know what to expect if I do read more in future, but regardless of my opinion of the style, I definitely rather people post/publish in any form rather than not post at all. Anything that can inform people about something they weren't previously aware of is a good thing.


All our senses, without exception, can only detect the past. Every sensor (natural or man-made, photodiode or eye) deals only with the past.

Say a novice practices to catch a ball on a windless day. He gets better. What exactly is he getting better at? He is getting better at guessing the trajectory of the ball based on past 3D positions of the ball.

This guesswork has its limits. He learns the limits on a windy day. He gets better with more practice on a windy day.

This new knowledge too has its limits. He learns that on a windy day at the beach.

The brain tries to guess the future based on a sequence of events. This is why detaching from senses is an experience like no other. Meditation, irrespective of the modality, is a way to detach from senses. We do not live in the past.

If our brains were not capable of holding memories and only of detecting one event at a time, two cars moving at vastly different speeds would look separated in space, but nothing more than that. I think, our senses, our ability to hold on to memories (past sensor data) of events and our ability to compare two past events combined makes the world animated. This also seems to give birth to the sense of passage of time. Which also could mean that time does not exist?


Time exists, as is evident from the fact that events happen. We just don't know what it is.


> much better than the 'engineering' solution suggested where your vision flashes black every time you move your eyes

I don't see a real problem with this. Imagine a video camera that inserts a black frame every other frame (in the sensor data), and a transcriber that just doesn't bother including the black frames in the recorded data. Why would the black flashes from the sensors be a problem?
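To make the analogy concrete, here's a toy sketch (hypothetical, nothing to do with how real camera firmware works) where the "transcriber" simply drops the blanked frames, so the recording never shows the gaps:

    import numpy as np

    def capture(n_frames, height=4, width=4):
        """Simulated sensor that blanks every other frame (the 'saccade')."""
        for i in range(n_frames):
            if i % 2 == 1:
                yield np.zeros((height, width))       # blanked frame
            else:
                yield np.random.rand(height, width)   # real image data

    def transcribe(frames):
        """Keep only the non-black frames; the gaps never reach the recording."""
        return [f for f in frames if f.any()]

    recorded = transcribe(capture(10))
    print(len(recorded), "of 10 frames kept")         # -> 5 of 10 frames kept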


One of the best arguments against intelligent design is that you cannot praise a design without having some idea of what a good design would be; i.e. if the designer could have done literally anything else and gotten the same praise, the praise is meaningless.

But that's completely unrelated....

How about a visual system that did not need to take liberties with the perception of time?


Random aside: once you've had a few physics or engineering classes on waves, filters, etc, it feels really impressive that the cochlea performs something similar to a wavelet transform on incoming sound waves: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3280891/
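As a very rough illustration of what "something similar to a wavelet transform" means in practice, here is a toy Morlet-style filter bank in plain numpy (my own sketch, not the model from that paper, and nothing like the actual biophysics): each row is one frequency channel, loosely analogous to one place along the basilar membrane.

    import numpy as np

    def morlet(freq, fs, cycles=6):
        """A complex Morlet wavelet: a sinusoid under a Gaussian envelope."""
        t = np.arange(-cycles / freq, cycles / freq, 1 / fs)
        envelope = np.exp(-2 * (t * freq / cycles) ** 2)
        return envelope * np.exp(2j * np.pi * freq * t)

    def scalogram(signal, fs, freqs):
        """Convolve the signal with one wavelet per frequency: a crude time-frequency map."""
        return np.array([np.abs(np.convolve(signal, morlet(f, fs), mode="same"))
                         for f in freqs])

    fs = 8000
    t = np.arange(0, 0.5, 1 / fs)
    chirp = np.sin(2 * np.pi * (200 + 600 * t) * t)             # a rising tone
    bands = scalogram(chirp, fs, freqs=np.geomspace(100, 1000, 24))
    print(bands.shape)                                          # (24, 4000): 24 bands x time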


> Like the theory (tested as far as we can) that tall people perceive the world further in the past [...]

Can you offer a source? Asking for a tall friend.


I don't, sorry - read it somewhere but no idea where!


What you might not know is that your emotions can do this too (create feelings which feel real but don't match with the reality of a situation, with the purpose of fulfilling an underlying need of the brain - typically protection).

This is at the heart of mental health - the issue being that imagined feeling patterns are formed during infancy. During that period you need to maintain relationships with your caregivers in order to survive (so your feelings organise around achieving good caregiver conditions). However when you are in the wider world as an adolescent/adult things are very different (1. don't need your caregivers to survive and 2. the rest of the world doesn't act the same way that your caregivers do) and that's when things get bad-mental-healthy. Your brain is imagining feelings to satisfy a different environment to the one you are in.

Even more amazingly the brain knows it does this and has a built in correction mechanism - dreams. They don't auto correct, but provide instructions (if we listened). This isn't the Freudian-everything-is-a-penis/vagina vibe, but a much more practical route to using dreams to diagnose false feelings. I wrote a paper on this here: https://psyarxiv.com/k6trz

When you have spotted a false feeling it's very easy to get rid of it. You have to catch yourself repeating the pattern and break it. Do that a few times and your brain reconsolidates and no longer feels the inaccurate feeling as it understands that the survival function of it is no longer required.


Interesting read!

My oversimplification of the paper: give yourself real-life exposure to situations you fear in dreams. (You'll probably respond well IRL, and start to break down your anxiety.)

Some key quotes

“… dreaming results in the mental generation of situations where we should behave unanxiously, or without avoidance … the function of dreams is to highlight erroneous fear responses … anxieties are the over generalisation of the fear response, stemming from a real fear memory, resulting in unnecessarily avoidant behaviour. These are what dreams bring attention to.”

“in my own therapy, where I started to use [dreams] to identify which fear I needed to expose myself to next, I found that, as one dampens the highlighted fear in real life, in-dream behaviour becomes increasingly un-avoidant (i.e., in line with what the situation demands). … This process continues until the fear is no longer present, at which point dreams highlighting this anxiety no longer occur.”

“… typically diagnosed symptoms of social anxiety (e.g., avoidance of social situations, low self - esteem) are all sensible anxieties to hold in order to avoid potential conflict situations; avoiding interactions eliminates the risk of upsetting others and having low self-esteem results in a default conflict-avoiding position of appeasing others. … These dreams propose a route out of social anxiety which focuses on becoming comfortable upsetting others for the benefit of the self, rather than disproving seemingly relevant maladaptive beliefs (e.g., people don’t like me).”


Hey - thank you for the great summary :)


The first time I did MDMA that was driven home to me in an undeniable fashion. Nothing about my life situation changed, but this cloud of anxiety and fear that I had been living in since I was a child lifted like clouds breaking after a thunderstorm. I felt genuinely happy and safe for the first time in my life, and it made me realize how arbitrary my emotional interpretation of the world was most of the time. I realized I could choose to be happy, choose to be friendly, and people would reciprocate it.

Now, MDMA was bullshit in its own way, of course, but I’ll forever be grateful for the realization it gave me. It changed my life.


> Now, MDMA was bullshit in its own way, of course

I really don't get why you said that (I use the stuff myself). Do you really think that, or is that partly an externally-acquired anti-drugs message, perhaps picked up from school in drug 'education' lessons, or am I being uncharitable?


I DJ'd and promoted raves for close to 10 years. I'm extremely familiar with its good and bad sides. The happiness that MDMA makes you feel is as illusory as any other emotion is.


> as illusory as any other emotion is.

My emotions are as real as my bones. If you think emotions are an illusion, you should spend a few months with severe clinical depression (I had much more than that). You'll soon change your mind.


> You have to catch yourself repeating the pattern and break it

How do you recommend doing that?


A therapist is very useful. It is difficult to see yourself clearly from the inside out. It's easier for someone from the outside to help you spot patterns, then you will gain some level of awareness of how they arise in your life (because you will be discussing your life with the therapist).

The paper I link to on dreams should help too. Dreams are sort of an internal mini-therapist, but again, it is easier with an outside perspective.


The first step is to consciously notice the pattern - at first it may be difficult and you would notice it only after the fact, but with some introspection and training you can generally also learn to notice and acknowledge it while you're in it. When you do catch it with your "rational brain part" it's generally not that difficult to break it.

That's a big part of what cognitive behavioral therapy does. Also, mindfulness meditation is one way for doing this that helps some people with the "catch yourself doing that pattern" part.


Journaling or therapy can be effective strategies. Both give you the outside perspective needed to spot patterns.


When I first learned about this, especially the "Because your brain recognizes it's moving and adjusts what you see to make sure it sees the 'right' thing" part, it explained a phenomenon that really confused me as a child:

I would sometimes "catch" our bathroom clock (analog, with second hand) going backwards for a second. But no matter how long I stared at it, it wouldn't happen again. Only when I turned my attention to it would it sometimes seem to go backwards for the first second.

With the "your brain extrapolates the movement backwards" explanation, this makes perfect sense if the second hand vibrates a bit on each tick. Apparently, if I focused on the clock while the second hand was vibrating for a fraction of a second after moving, it might be moving ever so slightly backwards at that moment, which gets extrapolated into a full second of backward movement. In the end it's just another optical illusion.


> I would sometimes "catch" our bathroom clock going backwards for a second.

Woah, I remember this too. It's a memory I had long forgotten. Every once in a while if you glance at an analogue clock that doesn't have a smooth second hand, it looks like the first time the second hand moves is backwards.


If you have a clock with a smoothly sweeping second hand and try to trace its path ahead by 5 or so seconds (in other words, when it's at 0, you're looking at 5; at 1 second you are looking at 6, etc.), time will speed up and slow down in bizarre ways, and sometimes it will even look like the second hand completely stops occasionally.

It's a combined effect of saccades and your brain drifting in and out of focus.


While it could be vibration, it doesn't seem very likely you could see the vibration and motion-estimate it backwards.

I seem to recall a similar confusion a couple of times. It's more likely just a bad coverup and/or a timestamp confusion in asynchronous processing.


Have you ever noticed car wheels looking as if they are spinning backwards instead of forwards?


This is due to the rotation rate of the structural elements of the wheels being just above the flicker fusion threshold frequency, around 60 Hz [1]. As the wheels spin faster, they just look like a blur.

[1] https://en.m.wikipedia.org/wiki/Flicker_fusion_threshold


Thanks. Very interesting.


It's also fascinating when the other end of the visual system (the brain) has issues.

My aunt had a very small stroke in her visual cortex. Incredibly it only impacted the visual processing in the upper right quadrant of the right eye. Everything else was fine.

She was telling me that the affected part of her field of vision wasn't black (as you'd expect). She described it as "empty". It just wasn't there. There were no good words to describe how that looked visually.

Interestingly as her brain recovered somewhat, the "empty" area got a bit smaller, then her brain started to do "error correction" and to her, the "empty" part is gone, but it doesn't provide any visual information. Similar to the tweets, it's just filled with "background".


10+ years back a handful of times I suffered from (what I think were, but you’ve got me thinking) migraines that resulted in localized loss of visual perception. In my case the spot was right in the center of my field of vision.

The loss manifested itself at first as trippy vibrating colors, which soon changed to non-black nothing.

After that I would look straight at things not only to not see them, but to not even realize I am not seeing something.



I've had these several times myself. For me, they start as a small "blind spot" at the center of my field of view, which was really scary, particularly if I was trying to read or do screen-work at the time. After the first one, the appearance of the rainbow zig-zags was actually comforting, because that confirmed what it was. Since I was effectively incapacitated for the duration, I'd just lie down and enjoy the light show.

It's been years since I had an occurrence, but the timing was pretty reliable.

~30 minutes: slowly growing blind spot at the center of my field of vision.

~15 minutes: rainbow zig-zags in a slowly expanding circle from the blind spot until it encircled my whole field of view, after which it faded out and I could focus pretty normally again. No headache during or after.

After multiple rounds, I figured out the, or a, trigger: absinthe. Drinking absinthe reliably brought one on ~18 hours later. Only ever had one that I couldn't tie to absinthe. Stopped drinking absinthe years ago, and the scintillating scotoma stopped as well.


Sounds about right, good to rule out any sort of stroke… Though honestly that pulsating phase the article highlights was the least of my problems. For 10 minutes or so while it’s there you can at least tell something’s wrong, for the rest of the day it’s a treacherous blind spot that’s easy to forget about.


That is very common. It's the classic migraine aura.


We also cannot see what's behind us, yet it doesn't feel black, just not there.


The exercise that made me perceive what I think is the closest to what's described as "empty":

1st step) Imagine you have eyes in the back of your head.

2nd step) Concentrate on trying to see what's behind you.


I’ve heard blind people (that weren’t born blind) describe it a similar way. “I don’t see black, I see with my eyes what you see with your elbow”


My dad had a small stroke that left him unable to see in one eye. But being neurological, this wasn't exactly like losing sight.

If someone sitting next to him spoke, and he wasn't facing them with his remaining working eye, he would be taken by surprise, and would not always understand who was talking to him. Instead of turning his head, he would frequently ask, "Who's that? Where are you?" It wasn't just that he couldn't see the person, it was almost as if the person wasn't there. He frequently described how the blind side of his vision wasn't black, just "not there".

Another oddity was that he didn't notice the loss of vision until maybe a couple of days after it happened.


The same thing happened to my mom after her stroke.

There wasn't a black spot in her field of vision, it was a kind of indefinite background that didn't stand out, but didn't have any information. Her best explanation is that it was like your peripheral vision, you can "see" things to the side, but you're not getting 100% of the detail you'd get if it was in front.

But yes, she said the same thing - it freaked her out when she would move her eye slightly and something that was hidden by the blind spot popped up.


> it seriously shows you the image at the new point, but time-shifts it backwards so that it seems like you were seeing it the whole time your eyes were moving... you can see this effect happen if you watch an analog clock with a second hand. Look away (with just your eyes, not your head), then look back to the second hand. It'll seem like it takes longer than a second to move, then resumes moving as normal.

Is there a citation for this? I can't find any paper on it, the Wikipedia articles on "saccadic masking" and "transsaccadic memory" don't mention the phenomenon at all, and I've been trying it with my own clock (where the second hand moves in one-second increments, not continuously) and it simply doesn't happen for me at all.

I can't figure out why your brain would "rewrite past memory" instead of simply persisting the old image either.

If anyone could point me to an actual name for this phenomenon, or a paper proving it, I'd super appreciate it, because it seems so implausible that I'd love to understand it more.

EDIT: found the actual term for it -- "chronostasis" -- and the memory phenomenon is "neural antedating" or "neural backdating". [1] And while it appears to be a reported phenomenon, it appears to be understandably difficult to verify experimentally, since it is an inherently subjective phenomenon and experiments involve potential sources of bias that could alternately explain it.

[1] https://en.wikipedia.org/wiki/Chronostasis


"I can't figure out why your brain would "rewrite past memory" instead of simply persisting the old image either."

There isn't an image to persist. The "image" you see is an active construction of your brain working on incomplete and inaccurate sensory data. As a hypothesis, your visual system doesn't have the memory to "persist" the sensory data, the data during the saccade is not meaningful, and your brain, while constructing your own personal chronological sequence after the fact simply uses what it has on hand.


> it simply doesn't happen for me at all.

FWIW, I have noticed the effect many times, in particular with the second hand of analogue clocks.


Yes, but never while you're listening--it's not a hack by your vision system, it's your perception system. You're basically hallucinating all the time, mostly to synthesize what's 'out there'.


Even before I knew this was a thing, I know I've looked at my watch and for a second been confused about why it seemed broken when it was working just minutes ago, simply because the second hand didn't appear to move in time.


I had assumed that this was part of my watches' mechanisms that self-calibrated but it never occurred to me that I just happened to notice this self-calibration so often!

It's interesting that I casually came up with a complex explanation and didn't even try to critically examine whether it was plausible or not.


>And while it appears to be a reported phenomenon,

I believe you're misreading that Wikipedia section. The phenomenon exists, and the mentioned aspects all play a role in causing it.

It is difficult to explain which part of the process causes the time distortion, and the measurements of how distorted things can be are influenced by the testing.


Studying computer vision, and specifically the intersection of visual computing systems and human vision in the context of Augmented Reality, really cemented for me how much "bubblegum and duct tape" human systems really are - and yet totally amazing.

Namely, how flexible, fluid, and adaptive the brain is. The eye is pretty "dumb", but the processing pathway from the optic nerve through the visual cortex (which is hilariously on the complete other side of the brain) is a marvel of processing.


Here is a Thread Reader view of this content.

https://threadreaderapp.com/thread/1014267515696922624.html


Goodness that's long for a BLOG post! Much less twitter. Thanks for the link!


The wikipedia experiment is much easier to do and more pronounced. You'll need a mirror and a reasonably good phone camera to take video. I just tried it and it's amazing.

> This can easily be duplicated by looking into a mirror, and looking from one eye to another.

But you can't see your eyes move in the mirror. They seem strangely very inert. If you play the video afterward, you'll see your eyes moving!

> https://en.wikipedia.org/wiki/Saccadic_masking


It's very easy to find the blind spot with the mouse cursor. Close one eye, and with the other look at the opposite side of the screen. Move the cursor around on the side corresponding to the open eye. You'll plainly see the cursor disappearing and reappearing—meanwhile, your brain pretends it knows what static content is in the spot. Most probably, it will even interpolate movement for videos—as it is wont to do.


Do the same thing with your favourite paint app: turn the mouse into a red pencil and click whenever the pointer is invisible. Eventually you've mapped your blind spot.
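For anyone who'd rather not fight a paint app, here is a throwaway sketch of the same idea (tkinter, my own hypothetical layout: keep your open eye on the cross, move the mouse, and click wherever the probe dot vanishes):

    import tkinter as tk

    root = tk.Tk()
    root.title("Blind spot mapper: close one eye, stare at the cross")
    canvas = tk.Canvas(root, width=900, height=500, bg="white")
    canvas.pack()

    # Fixation cross; keep the open eye locked on it the whole time.
    canvas.create_text(150, 250, text="+", font=("Arial", 32))
    probe = canvas.create_oval(0, 0, 10, 10, fill="black")

    def follow(event):
        # The probe dot tracks the mouse; attend to the dot, not the cursor.
        canvas.coords(probe, event.x - 5, event.y - 5, event.x + 5, event.y + 5)

    def mark(event):
        # Click whenever the probe disappears; the red marks outline your blind spot.
        canvas.create_oval(event.x - 4, event.y - 4, event.x + 4, event.y + 4,
                           fill="red", outline="")

    canvas.bind("<Motion>", follow)
    canvas.bind("<Button-1>", mark)
    root.mainloop()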


If you're on a Mac, shake the cursor really fast and the arrow grows massive. Never noticed that before. I was using my cursor to check the blind spot in my eye and was wiggling it in and out at the edge, and it freaked me right out. It went in small, disappeared, and came out 20x larger. I was like, fuck me, these eye tricks are mental.


Moreover, if you shake it really fast for a bit of time, you'll get an easter egg.


As someone with a PhD in neuroscience I like the sentiment of this post but it’s off the mark in some key ways.

From a high level @foone is right: our eyes and visual system are nothing like a camera. Our visual system constructs an internal 3D model of the world, our “perception”, given observations in the form of the electrical activity in our photoreceptor (rod and cone) cells. It tries to do this in a way that balances energy use and accuracy of the model. This simple concept can actually be used to explain a lot of things that may seem like ‘hacks’, as @foone put it. One example: saying saccades are a problem is misleading; they likely evolved to give us higher acuity (resolution) vision without having to pay the energy cost of a larger fovea [1, 2, 3]. They also likely allow us to see color more accurately outside of our fovea.

I also really don’t buy this time-shifting idea, although other neuroscientists would probably disagree. As @foone said our eyes are not cameras, our perception is our brain hallucinating an internal 3D model of the world. It knows what data to incorporate into the model and what not to, i.e. a blurry saccade is not a good representation of the world. If you want to read about how this can be framed with bayesian statistics you can read one of my favorite papers from my advisor [4].

[1] https://en.wikipedia.org/wiki/Saccade#Function

[2] Ratnam, Kavitha, et al. "Benefits of retinal image motion at the limits of spatial vision." Journal of vision 17.1 (2017): 30-30.

[3] Anderson, Alexander G., et al. "A neural model of high-acuity vision in the presence of fixational eye movements." 2016 50th Asilomar Conference on Signals, Systems and Computers. IEEE, 2016.

[4] Olshausen, Bruno A. "27. Perception as an Inference Problem." The cognitive neurosciences (2014): 295-304. https://pdfs.semanticscholar.org/bb00/42b5e48feff89a95182c63...
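If you want a feel for the perception-as-inference framing in [4] without reading the paper, here is a toy one-dimensional sketch (my own simplification, not from the paper): the percept is a precision-weighted blend of a prior expectation and a noisy sensory measurement.

    def perceive(prior_mean, prior_var, measurement, meas_var):
        """Posterior over a 1-D stimulus, assuming everything is Gaussian."""
        w = prior_var / (prior_var + meas_var)   # trust the data more when it's precise
        post_mean = prior_mean + w * (measurement - prior_mean)
        post_var = (prior_var * meas_var) / (prior_var + meas_var)
        return post_mean, post_var

    # The brain expects the line to be roughly vertical (90 deg); the retina
    # reports 70 deg but very noisily, so the percept lands in between.
    print(perceive(prior_mean=90.0, prior_var=25.0, measurement=70.0, meas_var=100.0))
    # -> (86.0, 20.0)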


I have a condition called nystagmus[1] which makes my eyes saccade constantly. It mostly happens when I look to the left or right, with what's called a "null point" in the middle. Having a null point in the middle is a lucky break.

It doesn't affect my day-to-day, which I think is surprising. In fact I have very good hand-eye coordination and visual perception, which may be a side-effect of constantly having to track moving objects.

It's also a symptom of being drunk, or a brain injury. Medical students are always very concerned and ask whether I've hit my head, and I often get breathalysed when I'm pulled over.

[1] https://www.aoa.org/patients-and-public/eye-and-vision-probl....


I get nystagmus from time to time (maybe once a year) because of positional vertigo, but I have since learned to "fix" it by applying the Epley manoeuvre myself. I guess yours is not treatable though.


This is an interesting write-up but doesn't go far enough. Yes, you're not watching an HD pixel-based video when seeing.

But the phrasing of "your visual system shows you X" is still off. The visual system is you, the eyes and the retina's neurons are you, etc. There is no "you" hidden somewhere deep in the brain that watches what the brain shows it, like a homunculus.


Biologically yes, you are all your body cells. You are your brain as much as you are your intestines.

But psychologically it depends on what you identify with. Most people identify (wrongly) with their consciousness.


There is the possibility to observe your thoughts, which strongly indicates two "you".


There _is_ the possibility that this is "just" memorizing / remembrance / memory of the just-passed (now-past) instant. Meaning as you think about "current" thought-X, the current instant is now thought-Y (thinking about X) and X is no longer "your current thought". Hm. Impossible to measure or decide because we can't ever possibly define "how long" is "now".


Found John Searle!


Those of us who have had profound psychedelic trips and experienced time loops might beg to differ.


The understanding that there is no "you" watching a movie constructed by "your visual system" is actually a great explanation for these kinds of experiences. In order for you to experience a psychedelic hallucination of a face, there doesn't actually need to be a face constructed anywhere in your brain; all that needs to happen is for you to _believe_ that you saw a face. If something causes your "face detector" system to light up, you will think you saw a face whether or not a face was there. This brings the possibility of psychedelic experiences within the fathomable; it's not like you have some massive GPU in there constructing complex animations that the rest of you is watching.


Do you ever see a cat or a bird move in the corner of your eye, but when you look closer it was just a leaf or a shadow or something? I think you see that cat just as much as you see anything at all. The visual processing just got a little too carried away in a way that was easily noticed, just like the seconds hand on the clock.


All the time. In that moment the leaf that used to be a cat also instantly changes to having been a leaf all along.

Our experience is like a dream vaguely guided by reality.


One of their examples, about the balls appearing bigger when you need to hit them, reminded me of something from a YouTube video [0] about our senses lying to us. The example brought up in the video is how a hill appears steeper to us if we have a heavy backpack on!

Overall though, those were some cool new examples of eye limitations I hadn't heard of before, like the accumulation of laser light damage.

And yeah, similar to other commenters, some of these could just as easily be described as amazing adaptations to the limited resources available as they could as ugly hacks. I guess the important part is just to become more aware of our limitations, in the hope that it helps us get a little better understanding of the potential difficulties we all face.

[0] https://youtu.be/BMvgOjGPXyw


> how a hill appears steeper to us if we have a heavy backpack on!

I've reverse hacked this: When I'm tired and need to walk up a hill, I look straight down at the ground such that it appears horizontal, this makes it easier to walk.


Come to think of it, that's kinda sorta what I do too! Stare at the ground right ahead and just plow on. I always put it down to just saying, well, it doesn't really matter what's further ahead; just follow the path and we'll get there eventually. I like the idea of it as a reverse hack though. Gotta look up occasionally to enjoy/watch out for stuff though.


In general, thinking about how we see using metaphors drawn from photographic media (chemical or digital) will lead to deeply incorrect understandings.

We do not see images. Our brain does not have a frame buffer. Our minds construct the visual experiences we have of the world around us. They construct them from reality and we have developed technology that can make recognizable facsimiles of how we see the world, but they have very little to do with each other.


A similar (and similarly good) thread on sensory and motor systems: https://twitter.com/analogist_net/status/1014397203450744832


It's not only vision. When you're sleeping and there's a sound that will wake you up, you can have a dream fragment which would have to know the future to compose itself this way.

I don't know if it has a name but I've noticed this many times.


I also noticed that and couldn't find any name for the phenomenon. It seems like there are some things on the internet about "Sensory Incorporation in Dreams", but I failed to find anything related to this specific behavior of dreaming about something that leads up to the noise.

https://biology.stackexchange.com/questions/56798/what-is-th...


I thought I was going crazy when I first noticed the chronostasis effect on my watch (that is where the ticking second hand seems to stick in place when you first glance at it). Years later by chance I came across a paper explaining the mechanism.


I would often "see" the second hand tick, but not move forward by a second. More like a "flick" than a tick.


Same. I've always wondered this, but never enough to google. Amazing the things I learn just clicking random links here.


I started reading and immediately thought of Blindsight. The tweet thread mentions it later, make sure to check it out if you like hard sci-fi, it's one of the best.


It was my first thought as well. The idea of aliens which can just "skip" the sight of humans is pretty cool. The same goes for the concept of consciousness as a parasitic/optional part of intelligent species.

Blindsight is one of the best, 10/10 on the Mohs Scale of sci-fi hardness. I love the description of the scientific concepts behind the sci-fi story at the end. Such a shame there are so few books with that kind of deep research.

It is one of the best written first-contact novels out there. I had the luck of hearing Watts lecture at one of the cons. He is even better in person.

If you are from Canada, it is a good idea to check where he is speaking.

The author's blog, filled with pop-science concepts and ideas [1], is well worth it.

[1] https://www.rifters.com/crawl/


Also note that the author has made the whole book available to read free online and via ebook:

https://www.rifters.com/real/Blindsight.htm


The sequel, Echopraxia, is also excellent. Not quite as mind-bending, but has all of the first book's existential creepiness.


> It'll seem like it takes longer than a second to move, then resumes moving as normal.

Shit, I knew it was real! Adolescent me noticed this and had a hard time explaining it or finding information about it (this was around the Excite era).


Ohhhhhhhhh! So that's why whenever I stop and look up at a traffic light with a timer, the first second always seems longer than the subsequent ones.

This is so cool. It's one of those mundane things that you think only occur to you, when in reality it's way more common and intriguing. Like those eye floaters.


Why would you assume that your experience isn't common to others with the same evolutionary history and environment?


It's an effect that's very hard to describe - John: "hey Jim, did you notice that last second was longer?" It's very personal: if Jim was not looking, or was already looking at the timer from before, John's assertion would just sound absurd. It also requires being attentive enough to the clock and to the sense of time in general.

and personally I kinda had a general feeling that could be at best described as 'the first second of a timer always look longer than the others' which roughly matches what magnio said, and that was it, it's not something I'd widely share unprompted, people already think I'm enough of a nerd as it stands.


I asked my parents once about the eye floaters but they didn't seem to get what it is, so I just assumed it's only me.

Granted, as a kid I had this egotistical sense of self, to which I attributed a number of misconceptions about lots of things.


You might be interested in the concept of qualia [0]

Vsauce did a good video on it, "Is Your Red The Same as My Red?" [1]

0: https://en.wikipedia.org/wiki/Qualia

1: https://www.youtube.com/watch?v=evQsOFQju08


So do we have examples of people for whom these hacks are not functioning quite as well, and what the consequences of that are? Because a lot of these visual hacks he describes remind me of issues people with Auditory Processing Disorder have to deal with, except that it's vision instead of audio.

[0] https://en.wikipedia.org/wiki/Auditory_processing_disorder


I have one for you: the vestibular system works in concert with the eyes for "image stabilization" purposes, so when you are, e.g., walking around, all that up-down head motion is canceled out. Visual vertigo occurs when the two systems fall out of sync for some reason.

https://www.sciencedirect.com/science/article/pii/S030698771...
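A crude numerical sketch of the stabilization idea (toy numbers, not physiology): the eyes are commanded to counter-rotate at head velocity times a gain, and when that gain drifts away from 1 the leftover "retinal slip" is roughly what shows up as visual vertigo.

    import numpy as np

    def retinal_slip(head_velocity, vor_gain=1.0):
        """Image motion left on the retina after the eyes counter-rotate."""
        eye_velocity = -vor_gain * head_velocity
        return head_velocity + eye_velocity       # zero when the gain is exactly 1

    t = np.linspace(0, 2, 400)
    head = 30 * np.sin(2 * np.pi * 2 * t)         # head bobbing at 2 Hz, 30 deg/s peak

    print(np.abs(retinal_slip(head, vor_gain=1.0)).max())   # ~0 deg/s: world looks stable
    print(np.abs(retinal_slip(head, vor_gain=0.7)).max())   # ~9 deg/s of slip: nausea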


The same thing happens when you play a video game with heavy "view bobbing". Someone decided that it would be more realistic if your camera shook by 15 degrees on both axes with every move (looking at you, CoD:MW), but it only makes you feel sick, and you can't turn it off.


Another insane property is a type of blindsight in people who are cortically blind, i.e., their eyes are intact, but they have damage to the visual cortex, so they cannot form or process images.

However, parts of our visual system other than forming images are handled in other parts of the brain. For example, managing the tracking of objects by the eye - it's a completely separate group of control neurons (vs the visual cortex) that drives where the eye looks (and it's actually different for vertical vs horizontal tracking).

So, it turns out that if you present images with, say, vertical or horizontal stripes and ask a person with cortical blindness (developed after they could see) which it is, they'll tell you of course that they can't see and don't know. However, make it a forced-choice question where they must guess one or the other, and they'll get a high percentage right, far greater than chance. The information exfiltrates from the centers controlling the eye motion even though there is no cortex to form an image.


A little tangential: If you hard lock your eyes, say, all the way to the right and rotate your head slowly to the left, you can defeat the saccade mechanism and get a smooth pan, complete with motion blur.


This is basically because (one of?) the only times our eyes don’t do saccades is when they’re tracking motion. In other words, you can’t move your eyes smoothly unless you’re following something (like prey or a finger). In this case, you’re tracking the boundary of your periphery!


It seems to me it is more because your eyes can't move if they are locked all the way to one side.


https://en.wikipedia.org/wiki/Smooth_pursuit

" Smooth pursuit eye movements allow the eyes to closely follow a moving object. It is one of two ways that visual animals can voluntarily shift gaze, the other being saccadic eye movements. "


Ah, now I understand. Thanks. Makes perfect sense.


> There was an experiment back in 1890 where someone wore glasses made with mirrors in them to flip their vision. After about 8 days, they could see just fine with them on.

Science YouTuber Kurtis Baute did this as well: his video "I Spent 7 Days Upside-Down" documents it... a very fun watch!

https://www.youtube.com/watch?v=2HaXUCQKBjs


Most informative takeaway for me: an amazing experiment to "see" saccades: (1) mirror, vs (2) iPhone selfie camera, vs (3) MacBook Photo Booth.

Scanning my own face with each of these 3, I tried to see my own saccades, and (1) I am blind to them in the mirror; (2) barely see them on the iPhone; and (3) very visibly see them on the MacBook.


I think a similar effect was described by Daniel C. Dennett. Imagine a row of lights next to each other. The leftmost one blinks, then the one to its right, then the next one, creating the effect of a light dot moving smoothly from left to right (similar to how animation works).

What does your brain see in the period between the blinks? Because it perceives the light as a smooth motion, and not as discrete jumps, it must "see" a virtual non-existent light in between two real lights. But if everything in your brain happens chronologically, how would it know if the light to the right would light up or not? It must have "backfilled" that information into your memory/perception after seeing the right light blink.


It probably extrapolates, much like how when you stare at those videos of a moving background and look away, your environment appears to shift in reverse of the motion you were observing.


I'm a bit suspicious after spending 5 minutes trying to replicate the frozen clock phenomenon and having no success. Often, the second hand ticks within a very short window (<150ms) of looking at the clock, consistent with what I would expect.

When you shift your focus to look at an analog clock, the time to the next tick of the seconds hand should follow a uniform distribution between 0 and 1 seconds if there are no visual bugs. It seems likely that people experience some confirmation bias, where the delays above 0.9s stand out more in their memory.

I really don't buy the claim that there's a 500ms delay in perception here. At the very least, this should be bounded by the time it takes to move your eyes (~100ms).


I too was unable to replicate that experiment. I took it a step further by not only having an analog stop watch, but also having other things close by that were constantly moving. Even though my timing resulted in variable durations to the next tick, some longer than others, I did not notice any “paused” effect from other things.


Do it with a fan and you'll see the fan blades going a little slower for few milliseconds when you look at them. Adjust the fan speed for even better results.


It's a bad experiment, since humans are very bad at estimating very short (less than 1 sec) time frames. Try the mirror experiment; it's straightforward.


Does the same chronostasis effect with the second hand on the clock work if you can hear the clock tick?

I'm having a hard time visualizing this. If it still happens, would you sense a longer time between clicks? Or does the effect go away?


> you can see this effect happen if you watch an analog clock with a second hand. Look away (with just your eyes, not your head), then look back to the second hand. It'll seem like it takes longer than a second to move, then resumes moving as normal.

Woah, woah, woah. I'd noticed this before when sitting around waiting for something (reception rooms, etc.) but never thought much of it. Thought it was just my mind playing tricks. I am so happy to know this is a consistent phenomenon for everybody, and there is a concrete reason for it. The human brain is fascinating.


This is a great write-up. I love learning things about perception and how the brain manipulates data behind the scenes. Are there any good layman books about this?


Think about streaming a live video from Youtube inside your browser. There's a lot of complicated stuff going on underneath: layers upon layers of compression, error correction, buffering, etc. Many ugly, crazy hacks to squeeze the signal into a system not built for it. The end result is a lossy approximation of the original and definitely not in real time.

The visual system is a lot more like videos than people think.


I was unfamiliar with the Ted Turner black & white movie colorization controversy (circa 1980s); an interesting cultural conflict:

https://www.latimes.com/archives/la-xpm-1986-10-23-ca-6941-s...

http://archive.is/LAkOO


I don’t think these are hacks. They aren't bubble gum and duct tape. They are well-tested solutions for current physical limitations.


Today I noticed that when I open my eyes, my vision quickly fills in from the periphery to the center, like a shrinking black bubble disappearing in the middle. It was easier to see in a dimly lit room after staying there for a while.


Hmm, how does saccadic masking jibe with the fact that if you "roll" your eyes and keep rolling them, you still see an image? It's blurry, but it's there, and you're never holding your eyes in one position.


Why oh why do people abuse Twitter to post entire blog posts? Do they also drink from plates, and shower in the sink? Write kernels in JavaScript? smh


A twitter thread to answer your question https://twitter.com/Foone/status/1066547670477488128


Because you get much more engagement on Twitter than you would with a blog post.


I don't really get the beef with this. Twitter's UI for reading these tweet-streams is pretty decent these days.


See that's where

gorkish 1 hour ago | parent |on "I don't really get the beef with this..."

you and I have very markedly

gorkish 1 hour ago | parent |on "I don't really get the beef with this..."

very different ideas about what

gorkish 1 hour ago | parent |on "I don't really get the beef with this..."

"pretty decent" means

gorkish 1 hour ago | parent |on "I don't really get the beef with this..."

because I actually think this kind of

gorkish 1 hour ago | parent |on "I don't really get the beef with this..."

nonsense is really quite bullshit.


I'm going to have to disagree. Because the link only shows the first tweet, it's really confusing if I don't remember to click the small "show thread" link. As far as I know, there's no way to share a link to the entire thread.


I agree it's not ideal, but there's more interaction going on.

Even if you add comments on your blog, you will never get the same thing you get on Twitter.

That said, I don't think Twitter threads work for all sorts of written content either.


This author has written about this in the past. He says that if he’s faced with an open blog post window he can never write/complete/publish anything, but the Twitter format makes it easy for him to do so.


I suspect there's a study to be done here. In what way is the open blog post window intimidating? Or discouraging?

I for one often feel that if I have something long to write, I need to open a proper word processor rather than paste it into a nondescript box, lest I somehow lose what I was writing. I also feel that in some of those tiny boxes, I can't adequately arrange my thoughts, as I can't see the structure of what I wrote before.

Perhaps the Twitter format alleviates the writer's tension by forcing them to "chunkify" (packetize?) their writing into small portions, with the satisfaction of being able to hit "enter" and call it done after every little one.

Of course, I think paragraphs can accomplish this as well.

But coming back to my original statement, I wonder how the common tools of today make us look differently at how we compose and structure our thoughts. If I think of the way I'd communicate with colleagues over chat, or email, or a collaboration tool, or heck, on HN or Reddit, and compare that to how I'd go about writing a blog post or paper, they're two quite different things.


No. In his post about it he says he has very bad ADHD. He's having treatment but it's not working well at all. He can't write a blog post because his brain gets tied up in knots thinking about writing and editing a long post, but he can write short scribblings.


Perhaps he should write (or contract someone else to write) an application to allow him to create a private twitter-like feed, then reformat it into a normal blog post.

If it were not in such an awful format, I might have read further. I was interested enough to see what the post was about, but the format was enough to make me think "I'm not that interested".


Perhaps you should write that app, if you're so concerned with how they publish their posts?


I suppose he could do that, but the real question is, what is the value to him of having you as an additional reader? Probably too low to justify changing his tools.


I guess that depends on how many other people like me have the same reaction, and how many readers he wants to actually have.

Could go either way. I'm not saying he should change the way he does things just to please me, and I might not be a reader he cares about, and that's fine.

I also understand that it could be the act of publishing a piece at a time, and having it immediately be publicly available, that pushes him to actually complete the post (adding just enough pressure on himself, or whatever).

I just wonder if the format choice might have a larger impact than he thinks.

My suggestion was intended to be constructive. I was thinking perhaps he could still get the effect of the twitter-like way of doing things, while ending up with a final format that would appeal to a broader audience.

Of course this is all just speculation on my part. Likely he will not even see what I suggested, not that I really expected him to.


Sure, just realize that the value to having an extra reader might be zero, and the value of having you specifically (or any particularly scoped reader) could actually be negative.


Not to be snarky, but this comment and its follow-ups will show up in 9 out of 10 Foone Twitter threads that get posted to HN. It's idiosyncratic maybe, but this is just the way he delivers his content.

I'm mostly glad it's not Medium or some other paywalled shit.


They're addicted to attention. Twitter is engineered to encourage this.


Here's another one (apologies if I missed it below): when you are in an accident, it seems like everything is in slow motion. No, it's that your memory is more complete, or at least you think it is, but it's not really. An experiment was performed to test this effect, to see if you actually perceive things faster when you're in an accident: https://www.livescience.com/2117-time-slow-emergencies.html


Saccades are a great demonstration of the idea that consciousness is actually very different to what we intuitively imagine it to be.


This makes no sense to me. If I turn to look at speeding cars, they don't appear frozen for some brief moment.


WAIT! Is this the reason why the 48 fps Hobbit movie looked so weird with jumping frames and whatnot?


High frame rates are associated with TV shows and sitcoms: https://en.wikipedia.org/wiki/Motion_interpolation#Soap_oper...

Also, those 48 frames need to be converted to 60 frames and as a result, some frames will be shown for a disproportionate amount of time and/or created by mixing two adjacent frames. The article is about interlaced modes, but I think it also applies to progressive scan: https://en.wikipedia.org/wiki/Three-two_pull_down
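A small sketch of that second point (assuming plain sample-and-hold frame repetition, no blending): converting 48 fps to a 60 Hz display means some source frames are held for two output frames and others for one, so frame timings alternate between ~16.7 ms and ~33.3 ms, which can read as judder.

    from collections import Counter

    # How long is each 48 fps source frame shown on a 60 Hz display,
    # assuming simple frame repetition (no motion interpolation)?
    SRC_FPS, OUT_FPS = 48, 60

    shown = Counter()
    for out_frame in range(OUT_FPS):                # one second of output
        src_frame = out_frame * SRC_FPS // OUT_FPS  # which source frame to hold
        shown[src_frame] += 1

    # Counter({1: 36, 2: 12}): 36 frames shown once, 12 shown twice per second.
    print(Counter(shown.values()))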


No, I mean it looked weird in CINEMA, where the frame rate was exactly 48 fps. And by weird I specifically mean the jumpiness observed occasionally in certain scenes, as if the frames were suddenly being played at much more than 48 fps. I doubt this was a technical glitch; rather, it was something happening in my brain.


I don’t seem to have the same experience of saccades as described here; I see blur but mostly ignore it. When I look to the right and left of my screen, I can see the light blur, and when looking across a projector beam I (and my friends) can note the red, blue, and green colours as separate, but only during the saccades. Anyone else have a similar experience?


We should fix it by adapting cephalopods' eyes[1] and throwing out our own garbage.

[1] https://en.m.wikipedia.org/wiki/Cephalopod_eye


Isn't this exactly what requestAnimationFrame does?


This is just so hard to read. I can't concentrate on what the author is trying to say because I'm constantly distracted by the way they're trying to say it, meaning all the irrelevant exclamations with which the author is peppering their tweets.

I have to wonder: what is gained by mixing up so much noise with an interesting message? Isn't there enough useful information in the subject of saccades, without the tone (which I suppose appeals to some)?

It's like when you're trying to hold a conversation with someone that disagrees with you on the internets and they constantly try to troll you, starting their comments with sentences like "no, this is wrong, you're an idiot and you don't know what you're talking about" before telling you why they disagree, or what they mean etc.


Why do people sprinkle curse words pointlessly over otherwise useful information?


I can’t speak for anyone else, but I like to sprinkle in swear words as a way of “grounding” the subject.

A lot of my friends will bother me for help with compsci or math stuff for some reason, and I usually try to help. Math and programming are hard, especially to someone who doesn’t already know it, and people can quickly get frustrated as a result of it being confusing. Throwing in a swear can be useful to signal to the person to relax, and to tell them that it’s ok to think of this stuff in goofy terms, and in my experience it can help lower the tension.

It’s not the best tool for every scenario, I don’t use that tactic when teaching kids, but I do think there’s utility in it.


I think it's a fun literary style sometimes. Useful information doesn't have to be presented in some specific way to be useful, and in this case I think it's additive.

I agree that it can be overdone, but it's pretty lightly sprinkled here. But it's the same as any other useful literary device, to answer your question. To make things more interesting or different.


My guess would be it's a type of signalling similar to "how do you do, fellow kids?" à la Steve Buscemi; in other words, for relatability. To the layperson, throwing in a curse word signals that it's not going to be some dry text that they'll be reading, but rather something more palatable.

Another commenter mentioned shock value, which doesn't seem far off. A little taste of shock is what jolts the reader's attention.

With that said, it should be used sparingly (if at all), otherwise the message loses eloquence and the intended expression is diminished by vulgarity. Someone also once said that using curse words makes people dumber and lazier because they end up using curse words in place of more descriptive words, which I tend to agree with.


This is something that bothers me too. When I read/hear someone using curse words needlessly, I take it as a sign of insecurity about their eloquence.


Shock value.


[flagged]


> What kind of brain damage does one have to suffer to keep choosing Twitter for long-form writing?

In particular, Foone has ADHD, if I'm not mistaken about the specifics, and has posted about it being the reason for choosing the choppy form (just so it's clear I'm not spilling any beans).

But frankly, Twitter is sorta alright, because authors still tend to be less long-winded when paragraphs of two hundred words don't quite work. I'm really tired of verbose yada-yada from HN, so I find myself actually attracted to links to Twitter, even though it's a hot mess in most other respects.


And has explicitly stated that the piece-by-piece authoring and publishing of twitter threads works better for them.


Sure. Fine. But the argument still stands that Twitter is awful for long-form writing(asterisk), and I doubt everyone else who makes mile-long Twitter threads has ADHD.

Also, ADHD is not "brain damage".

(asterisk)and everything else, in my opinion

edit: I'm too tired to fight hn markdown right now.


I mean, piece-by-piece authoring could still be done on a blog, which seems better because you're able to edit it afterwards (if you want).


The brain damage in question is the lack of creativity and laziness. In my country it is common (e.g. among journalists) to:

A. Post on Facebook first (which allows longer text) and then use Twitter to re-post content.

B. Write in Word/Notepad/whatever, take a screenshot and post that.

I find both methods much cleaner and easier to follow than a series of tweets.


Replace twitter.com with nitter.net: https://nitter.net/Foone/status/1014267515696922624


Thank you! That's wonderful!


https://threadreaderapp.com/thread/1014267515696922624.html

I hate Twitter threads too. I think this site does a nice job of pulling them together, and looks cleaner than nitter since it doesn't make each tweet look separate.


The author does this all the time. I follow him on Twitter because I love the weird stuff he finds/builds, but I don't know why he posts it on Twitter instead of a place more suited to multi-sentence works.


What's this thing with shredding a blog post into countless tweets?


I think batshit would be the proper type of shit for the title.


I like the word "bullshit" for this, even though batshit is the more standard expression. Because in a sense, this is about our brain bullshitting us.


> the answer is simple! your brain has EVEN MORE UGLY HACKS on top of this to avoid you seeing that.

Only computer programmers tend to describe biological systems in deprecating terms like this, calling them ugly hacks, kludgy, etc. As if we know best what's elegant and what's kludgy. I wonder if perhaps we do this because we would never have the practical intelligence to come up with a solution like that. It's not a great look for us.


I don't buy that. Vision developed for a different purpose than to provide full-HD sharp vision across the entire FOV. It has great pattern recognition and peripheral motion detection. It heals itself. It has low power consumption. It does not overheat and can work in many environments, even underwater.

Modern competition like a GoPro requires too much babysitting and can hardly work for 2 hours without maintenance.


You don't buy what exactly? This is just science on how eyes work and some of the interesting things that happen, the author isn't being prescriptive about what eyes are for or how they should be used. I think what you said is entirely consistent with the Twitter thread.


I think OP does not buy that a camera system is better than eyes, or that our vision system is bullshit insane.


I mean, I don't think the point is that cameras are better overall (apart from the advantages mentioned above, our eyes still have a ridiculously good dynamic range compared to cameras), just that if you want a predictable mapping from stimulus to result, our vision system (which is eyes + visual cortex) has a ton of quirks and biases which affect perception in ways which are really non-obvious and can really obscure what's actually happening (hence 'insane'), while cameras are far more predictable and regular, which can matter if you want to be precise about what's going on.


> low power consumption

Hmm, well the visual cortex makes up almost a third of our entire brain, and the brain uses ~20% of our daily energy. I’d hardly call that “low power”! Though granted I don’t imagine you can break a sweat just from thinking.
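Rough back-of-the-envelope numbers, using my own assumed figure of roughly 2,000 kcal/day total intake, just to put those percentages into watts:

    # Back-of-the-envelope estimate; the daily-intake figure is an assumption.
    KCAL_PER_DAY = 2000                         # rough adult energy intake
    watts_total = KCAL_PER_DAY * 4184 / 86400   # kcal/day -> watts, ~97 W
    brain_w = watts_total * 0.20                # ~20% to the brain, ~19 W
    visual_cortex_w = brain_w / 3               # ~a third of the brain, ~6.5 W
    print(f"body ~{watts_total:.0f} W, brain ~{brain_w:.0f} W, "
          f"visual cortex ~{visual_cortex_w:.1f} W")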


> I don’t imagine you can break a sweat just from thinking

I heard that people who were asked to do sorting tasks, e.g. just separating big oranges from small ones, got tired as heck, comparable to physical workers. Then again, dunno about sweat.


Yep, mental exhaustion is undoubtedly a thing! I'd love to see how heart rates change with something like test question difficulty. Since sweat is generally a product of heat and occurs when your heart works harder during exercise, I wonder if the heart also works harder to pump the brain with more oxygen to enable it to... think harder? Obviously just musing, am no physiology expert.


I would definitely call that low power. Our entire brain only uses about 20 watts of power; compare that to the 85,000 watts IBM's Watson uses.


What is the alternative, for example for driving? A computer and lidar that need a kilowatt just to do a single task, and aren't universal?



