Our Brains Are Not Multi-Threaded (calnewport.com)
273 points by janvdberg on Sept 13, 2019 | hide | past | favorite | 110 comments



The language here is weird. He's talking about doing more than one thing at a time. If you're talking about multi-threading in the context of the brain, the obvious comparison is with neural architecture. The brain does a massive amount of parallel processing and it very much is multi-threaded. But "multi-threading" sounds a bit cooler than "doing more than one thing at a time" I guess.


Yes, the title is rather poor. From reading the post, it seems what he really means is: "Mental Context Switches are Really Expensive". That's far less exciting, and probably also kind of obvious to most knowledge workers.


I think the title should be "Attention is not multi-threaded", which I believe is pretty well known in psychology and where the expensiveness of context switching stems from.


I wish more places and people understood this!

I am of the opinion that this, for instance, might be the single biggest reason nodejs has been successful, without people even realizing it. Since “full stack” development is so in demand/popular, using one language on both ends really limits the context overhead of switching languages and semantics between front end and back end, in a way that keeps productivity high.

(Yes, nodejs has other strengths too, and to be fair weaknesses, but I wouldn’t discount this as a factor offhand, at least.)


> That's far less exciting, and probably also kind of obvious to most knowledge workers.

I think his point was that, obvious or not, many knowledge workers organize their work that way and this will actively cause psychological harm.

I wasn't fond of his technology metaphors either, but I think the point he wanted to make was a valid one.

Though, using the metaphors as a tool of argumentation might also make sense, because people do use terms like "multitasking" a lot when describing situations where you have to switch around between different activities a lot (implicitly justifying it because the activity is also seen as simple and basic - computers do it all the time, after all).

So it might be useful to drill down on those metaphors and show how incorrect they actually are.

(If you want to nitpick on the metaphors, I think he let threads in computers off too easily: as probably every kernel developer can tell you, OS context switches absolutely do involve complicated machinery - and as every application developer knows, having multiple threads can absolutely lead to chaos if you have no strategy for managing them)
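To put a very rough number on that machinery, here's a toy benchmark of my own (not from the article): two threads hand a token back and forth, so each round trip parks one thread and wakes the other, paying for OS context switches.

```python
import threading
import time

# Two threads hand a token back and forth via Events. Each handoff parks
# one thread and wakes the other, so every round trip pays for (at least)
# two OS context switches. Numbers are illustrative only.
N = 1_000
a, b = threading.Event(), threading.Event()

def pong():
    for _ in range(N):
        a.wait()   # sleep until the main thread hands us the token
        a.clear()
        b.set()    # hand it back

t = threading.Thread(target=pong)
t.start()

start = time.perf_counter()
for _ in range(N):
    a.set()        # hand the token to the other thread
    b.wait()       # sleep until it comes back
    b.clear()
elapsed = time.perf_counter() - start
t.join()

print(f"~{elapsed / (2 * N) * 1e6:.1f} microseconds per switch")
```

The absolute figure varies wildly by OS and hardware, but it's reliably orders of magnitude more than a plain function call.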


Yes, and we're talking about conscious thought with context switching; not necessarily subconscious thought. You can drive or bicycle and actively be thinking about that conversation you had yesterday. Both are complex tasks.

The brains of mammals work very differently from the machines we create. We don't understand exactly how, but they are substantially more complex. Just look at how much work goes into writing a machine that can classify photos as cats or dogs; or software that can drive an autonomous vehicle.


Once a certain skill is fully internalized, using it may require no attention at all.

It's similar to growing a coprocessor designed to do that task: load on the CPU is reduced, and you may also get a small cache that buffers the work for a while, so you don't lose context even if interrupted.

It has limitations, though: tasks that are too complex are hard for the brain to make automatic.


Or, when you're trying to fall asleep, find yourself simultaneously mentally singing an earworm while thinking about code while trying not to think of some past trauma. It seems an everyday experience to have multiple streams of thought all at the foreground.


I have ADD, and I've been using the context-switch analogy for years to explain why it's so hard for me to change between two totally different tasks, while I can change pretty rapidly (with no penalty) between multiple similar tasks.

I've likened it to having to pack up one set of tools before I can unpack the other one.


I find that when doing something less interesting I tend to switch back and forth between two tasks quite a lot while working - each switch acts to regain attention before it starts dropping off again. As long as nothing external forces me to context switch this both keeps me focused and drives my boss nuts.


There is a big difference between multitasking and parallel processing. As a former neuroscientist, I get rankled when I read gross conflations like this; it doesn't help anyone understand how the brain actually works, nor does it effectively underscore the attentional drawbacks of trying to multitask.


The brain is more like SIMD than multicore processing, but I think a better analogy is an FPGA; it can do heterogeneous instructions, but switching from one task to another is a crushingly expensive stop-the-world affair.


Possibly people assume that they are conscious of all the tasks that their brain does?

Can they walk and chew gum?


You are conscious of most things your brain does. Background tasks like walking and chewing gum tend to be very rote and decision light. Try and write while listening to someone speak, fold a paper airplane while navigating a new city, or program while watching a film.


Writing, listening, watching a film and navigating a new place all require attention as they're novel activities. But I could easily make a shit paper aeroplane while navigating, and I've coded through pure muscle memory plenty of times. Almost any activity can be internalised.


> I've coded through pure muscle memory plenty of times

How many lines of code though? :D


Couldn't tell you, wasn't paying attention ;)


I'd call it more similar to a computer system with multiple specialized processors working in parallel and limited buffers between them. Parallel system, really, not multithreaded.


All mostly stuck on a single bus and constantly blocking.


Fellow former neuroscientist here! The proliferation of pseudo-neuro explanations and analogies is infuriating. I envy people like theoretical physicists who work in a field where laypeople don't feel like they have an intuitive understanding of their subject area.


Maybe you haven't noticed how "quantum" is being thrown around these days, check https://en.wikipedia.org/wiki/Quantum_healing and https://en.wikipedia.org/wiki/Quantum_mysticism


I for one am a specialist on quantum mechanics since I read Steven Hawking. It's all bonkers, I can tell you that, but because of the uncertainty principle, I'm not really sure. That's OK because there is some parallel world in the multiverse in which it isn't. And please don't ask me to explain, I've read enough to know that by Hoffstadter's Incompleteness theorems I cannot prove it.


I know that dog created the universe with a pair of parentheses and after that it was pretty much backwards and uphill.

ok maybe not: https://xkcd.com/224/


Former physicist here! Hearing people talk about vibrations, entanglement, dimensions, quantum and other specific terminology in a more or less woowoo/spiritual context is just as infuriating.


Fiction writer here. If I had a penny for each time one of my readers asked, "So did they go to another dimension?"...


I really wish there were more books for laypeople by neuroscientists. I don't care much for theoretical physics personally, but a lot of people want to understand their own brains.


> a lot of people want to understand their own brains.

You are what you think. I have no idea how or why... but I know this is so.


>You are what you think. I have no idea how or why... but I know this is so.

Given the evidence that conscious thought, self-awareness and "free will" are post-hoc effects, emerging after decisions are made by the brain at a subconscious level, it's more likely that you are what thinks you.


I gather there is some debate in the field as to whether this is a true effect or a consequence of bad measurements. I do find it fascinating and await better evidence before making my mind up either way.

I do, however, appreciate your splendid and witty statement and shall reward you with an upvote :-)


> Given the evidence that conscious thought, self-awareness and "free will" are post-hoc effects, emerging after decisions

But I thought I was a racer from age 6... and now I are one.


I remember Richard Feynman (a theoretical physicist) explaining how you could deduce things about the human mind by observing behavior and self-reported internal states.


I think a better analogy is the way GPU calculations are done: the calculations themselves run in parallel, but the overall process is sequential - move data from CPU RAM to GPU RAM, calculate in parallel on the GPU, move the results back to CPU RAM, then operate on them serially on the CPU.
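A schematic of that pipeline, in plain Python standing in for the actual CUDA/OpenCL calls (the function name and the "double every element" kernel are made up for illustration):

```python
def gpu_style_pipeline(host_data):
    # 1. Host -> device: a list copy stands in for the CPU-RAM -> GPU-RAM
    #    transfer over the bus.
    device_buf = list(host_data)
    # 2. Data-parallel kernel: the same operation applied to every element
    #    "at once" on the device; a comprehension stands in for that step.
    device_result = [x * 2 for x in device_buf]
    # 3. Device -> host: copy the results back.
    host_result = list(device_result)
    # 4. Serial CPU work on the gathered results.
    return sum(host_result)

print(gpu_style_pipeline([1, 2, 3]))  # 12
```

Steps 1, 3, and 4 are strictly sequential; only step 2 is parallel, which is the shape the parent comment describes.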


True, but from a big-picture view he might be doing a better job explaining the concepts to laypeople. And in that it might help them improve how they work / live :)


Sure, the layperson knows what multi-threading is.


The lay hacker.


But isn't the brain massively multitasking? It controls and keeps track of an awful lot of stuff. The "can't do multiple things [well] at once" really just applies to conscious efforts.


The keyword in the previous person's comment is attention. It's attention that is incapable of multi-tasking. It really just bounces from one thing to the other and attention is sort of what drives consciousness and directs it where to go. But the rest of the brain is doing a ton, such as memory encoding to both working and long-term memory. It's also doing things like language, motor, emotional processing, etc. as well as all the things more peripheral to consciousness like senses, time, vitals, etc.

If we look at the brain like a computer that interacts with the world as well as is interacted with, I like to imagine that the brain is very complex and running many parallel processes. Like a computer, it has many parts that interface to the world. Eyes are the screen, ears are the speakers, muscles are the mouse and keyboard, etc. I like to think of attention as the cursor: sort of a single process that directs what the user is interacting with.

I'm positive that is much too simple an analogy, because it still leaves the question: why can't consciousness control multiple attention cursors simultaneously? In many ways it does, in that consciousness can pass things off to automatic processing, like triggering muscle memory and other previously strengthened pathways. But I think why there aren't multiple attention processes running is still a big question yet to be answered. I think neuroscience still has a lot to understand before we'll be able to answer it sufficiently.


Maybe he meant the conscious mind, which we experience abstractly and which is somehow hosted in / by the brain.

In which case, sure, it makes sense that each thought blocks the main thread and that's why thought-loops are particularly annoying.

In terms of I/O though, I certainly try to optimize the main thread with concurrent patterns when the input queue gets hit with high traffic. Even if I'm recurrently mulling over something for days, hoping for a break-through or insight or whatever, it's not like I'm incapacitated and can't perform / prioritize other thoughtful tasks on the event loop.


It is, but that's not what the article is about.


It would appear that the prefrontal cortex is much less parallel than the rest of the brain.

This is why we often get much better task performance when we allow our gut feelings (which come from subconscious parts of the brain) to inform our reasoning.

Deliberative reasoning in the human brain has a very limited capacity.

See: https://en.wikipedia.org/wiki/Working_memory and https://en.wikipedia.org/wiki/The_Magical_Number_Seven,_Plus...


Threading has a rather specific meaning in CPUs, no?

Synapses are parallel, but so are various operations in a single-threaded CPU, even just the ALU (carry-lookahead addition, e.g.). Vice versa, hyperthreading switches context when the single-threaded CPU is stalled by memory look-ups.

It's notable how many ideas can converge, and how you remember that name out of the blue that you couldn't figure out minutes or days before is surely interesting. But attention is pretty much directed, so to speak. Just doing a different motion with each hand at the same time requires a certain amount of orchestration, which takes planning and training.

The brain is a superscalar architecture, sure, but you can't do taxes with one hand, paint a picture with the other, and explain quicksort to your mom over the phone while you get your ... nipples pinched (the classic Swordfish password situation). Or more simply speaking, you can't be angry (at the taxes), elated (because of the forms and colors) and aroused (the nipple pinching throws various interrupts, whether arousal or panic) at the same time while remaining pleasantly calm towards your mum. Well, that does not sound like a good study design, if only because of the health hazard. Come on, if you stress a metaphor, I want to as well.


No metaphor is perfect - my claim is that this is quite a bad one compared to many others that he could have chosen.


we agree that the choice of words is bad, we agree that the metaphor is appealing, and we can probably agree that we don't know how the brain works.

Mixing behavioral ideas with whatever accurately describes our knowledge of CPUs (systemic?) is a bad idea, overall. That's not reductive, and it breaks the deductive argument.


Perhaps a better analogue would be a largely single- (or few-) threaded main processor with a truly astounding number of single-purpose coprocessors? I.e. physics units (motion vector approximation), graphics units (optical processing), management engines (parasympathetic nervous system) etc.


"In the context of the brain" what is the difference between multi-thread and multi-task? Can you give examples?


I can simultaneously walk and listen to a podcast; but when I hear two overlapping voices I can only parse[1] one of them at a time even though I can easily choose which one of the two to listen to[1].

[1] listen to? Parse? Hear? I’m not sure what the right word is in this context.


In the tech community, we talk a lot about not interrupting programmers (or other knowledge workers) because of the cost of so-called "context-switching." I think this post raises the very interesting point that, much of the time, the sources of these interruptions are not external, but internal. Stop interrupting yourself! You're a knowledge worker! His advice to cultivate systems that afford one focus and prevent context-switching is salient.


It's a cliche that IT as an industry is hostile to women, and it's also a cliche that women are good at multitasking.

How do we know that people's opinions on the difficulty of multitasking are representative and not biased?

I'm not even speculating on whether gender differences might be innate or trained.


Cal's essay is backed by empirics. I'd add that, for those like me who used to believe that women are better multitaskers, both men and women are equally bad at multitasking. Early studies would sometimes show a female advantage or a male advantage (not sure why the female advantage idea came to be popular), but they were based on humans performing tasks that didn't resemble the real world... or at least tasks from the world of business/office work.

A newer study, based on a real world task of meeting preparation, showed that both men and women were quite bad at multitasking: https://link.springer.com/article/10.1007%2Fs00426-018-1045-...


I feel like everyone in this thread arguing about multi-threading/hyperthreading/multi-core/multi-processor as an analogy is missing the point. The post is about how you can focus your attention on a single task only (true multitasking doesn't exist) and that you should keep focus longer instead of switching context too often.


That's because the author is saying that the brain cannot perform parallel processing, which is clearly not true. You are making a distinction that the author does not.


Consciousness about what our brain does is single-threaded. Everything else is massively parallel processing.


I would suggest that it's not "single threaded", either.

Comparing consciousness to how a CPU processes instructions is a very poor analogy, because to this day the nature of consciousness is still a complete mystery to science.

I mean, we can think about stuff, but we don't think about them one after the other. We're thinking all of the time, but we tend to focus on what's immediately interesting to us. Consciousness is a pretty complex thing to unravel...


That’s closer, but I also think it’s wrong to look at the brain as a GPU of general mind stuff hooked up to a CPU of attention.

I think it’s likelier that each portion of the brain fires away at up to whatever its natural “throughput” is (some activities more or less parallel), and conscious awareness just happens to feel like a single global mutex.

I guess this is a game you can play where you keep refining the analogy until it’s too complex to help, but I don’t think the linked author’s analogy is good or useful. Better to just say we can’t be consciously aware of two things concurrently. It’s not a hard concept.


Right. Thank god I don't have to think about breathing or making my heart beat.


What you don't think happens on its own and what you think also happens on its own. The illusion is that thinking is you, identification with thinking. Once that is seen as an illusion you can observe yourself talking, thinking and verify all this happens on its own.


Effective task switching is a learned skill. I often find that switching between 2 or 3 tasks a few times over the course of a day helps me be more productive, because blocking is less of an issue, and sometimes blocking due to problems needing to be solved can be iterated on subconsciously while your forebrain is doing something else. I think there are parallels with neural networks: task switching helps make your network generalize better.


> In a world before multi-core processors, these threads weren’t actually running simultaneously, as the underlying processor could only execute one instruction at a time.

This is not true. Before multi-core processors there were many computers with multiple CPUs.

Is the author under 30 or something?


You don't think it's true that before there were multi-core processors, processors could only execute one instruction at a time?

It doesn't say computers could only execute one instruction at a time. It says processors.

If you're going to do needlessly pedantic bikeshedding, at least get it right.


> You don't think it's true that before there were multi-core processors, processors could only execute one instruction at a time?

It is not true.

Instruction-level parallelism existed before multi-core.

> If you're going to do needlessly pedantic bikeshedding, at least get it right.

Pot, meet kettle.


I don't care at all about this discussion! I was just trying to ding the commenter for going off on a tangent. Also, the person I'm replying to wasn't talking about instruction-level parallelism, they were misreading the article.


>I don't care at all about this discussion!

Then don't say shit like "at least get it right".


> In a world before multi-core processors, these threads weren’t actually running simultaneously

He's trying to describe the difference between multi-threading and multiprocessing to introduce the idea of task switching. But the bad history lesson is distracting.

There were long periods of history when most hardware vendors had at least one model with multiple processors, and there were some companies that specialized in them (Cray, Silicon Graphics, IBM, Fujitsu, Toshiba?), including ones built with x86 processors (Sequent, IBM).


My point is that interpreting this statement as a "history lesson" instead of a loose metaphor is quite silly.


The author has a PhD in computer science from MIT. You'd think he'd know computers.


Having a degree or special title doesn't mean you're any smarter than others and not having one doesn't mean you aren't.


Having a degree in computer science does mean you know more about computer science than others.


It ought to at least correlate with more knowledge, otherwise what’s the point?


To sell the product


Let me rephrase: otherwise what’s the point in getting a degree?


Whenever these comparisons of brains and computers come up, I am reminded of Freud's comparison between the brain and the steam engine.

People think about the world (and brain in this case) in terms of what they are familiar with. Today computers are ubiquitous and we try to see computers in nature. In 100 years it will be something different and we'll try to imagine the brain like that.


Lacking in evidence and uses vague terms like "psychic toll".


I think simple observation provides plenty of evidence. It's not a real scientific-method approach, but if you want to dismiss the relationship he's talking about, you need either to find a flaw in his reasoning or offer a better alternative explanation for the observation that when people try to "multitask", they usually do a shit job of it.


You can feel free to have whatever standard you want as to what makes up a worthwhile article. Someone's personal feelings on how the human mind functions, described with vague terms, is essentially worthless to me.

Given that the same complaint seems echoed in other comments, I'm not alone there.


My attitude toward a Cal Newport article is that he's published multiple highly regarded books that I've read and found very insightful and applicable in practice. So when he writes something that's brief and does some handwaving, but fits within his overall worldview (like this article), I'm inclined to take it more seriously.

If you've never heard of Cal Newport before now, I can see reason for some doubt, but I also think you're maybe missing out on a lot within the field.


His blog articles were both more concise and more valuable than his bloated books.


Multi-threading is not a good analogy. If you think of the conscious brain as a hypervisor switching execution context between multiple virtual machines, the analogy sticks slightly better. Running too many VMs (on what seems like a single-core processor) will slow things down and is akin to context-switching between "neural activities": a whole host of things the processor needs to set up before executing instructions in a sandbox. Switching between threads is cheap, and the conscious brain does it all the time, even when you're focused on a single task.

But it's all so relative and comparing it to a "computer" and claiming it's not efficient is not fair. In fact, I'd argue that the human brain is a lot more efficient at handling tasks because not only is it executing instructions, it's constantly learning and programming the next set of instructions.


A better title would be "Our Brains Are Not Multi-core" - so multitasking, just like multi-threading on a single-core CPU, will incur a penalty of context switching, cache misses, and so on.


Looking at split-brain patients, it does seem like each hemisphere is a "core", communicating via the corpus callosum. So you could say that our brains are multi-core.


"it’s often much easier for the programmer to write independent threads, each dedicated to its own part of the larger system."

oh, if only it were true...


I'm one of those people who reaches for threads to solve many a task, and that statement resonates with me. I've never understood why threads are considered so difficult, other than people not getting the message to avoid sharing data except via message-passing.

Any time you need to do something asynchronously, things get trickier, but whether you keep it single threaded or not barely makes any difference. Race conditions exist in single threaded code too as soon as it is interacting with a network or the OS or anything that takes time and you don't want to block.

It's not threads that are hard, it's the explosion of the state space of your program when you commit to accepting further user/network input whilst some task is ongoing, whether you implement that with threads or not.
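A toy demonstration of a race with no second thread at all (the shared-counter example is mine): each coroutine reads a value, yields to the event loop mid-update, then writes back a stale result.

```python
import asyncio

# A classic lost-update race on a single OS thread: the "read" and the
# "write back" are separated by an await, so other coroutines interleave
# between them.
counter = 0

async def increment():
    global counter
    value = counter          # read
    await asyncio.sleep(0)   # suspend: the event loop runs other tasks here
    counter = value + 1      # write back a value that is now stale

async def main():
    await asyncio.gather(*(increment() for _ in range(10)))

asyncio.run(main())
print(counter)  # 1, not 10: every coroutine read 0 before any wrote back
```

Same state-space explosion, zero threads - which is the point being made above.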


This was the point I stopped reading and came to HN to make this exact comment. Beat me to it!


Yeah, that part really breaks down the credibility of the article.


If you want to read something interesting on how thinking works I suggest 'Thinking, Fast and Slow' by Daniel Kahneman.


I enjoyed TFS but tend to agree with this review:

> Thinking Fast and Slow is my runner-up for “book most overrated by investors” (with Klarman’s Margin of Safety the champion.) People who pitch this book as “required reading” simply haven’t read broadly enough about cognitive biases: while the content is certainly useful and I don’t take anything away from Kahneman as a researcher, his writing/communication/worldview leave much to be desired, and the same lessons can be learned far more effectively and enjoyably via a variety of other books (many of which are suggested below).

http://www.askeladdencapital.com/daniel-kahnemans-thinking-f...


Interestingly,

https://retractionwatch.com/2017/02/20/placed-much-faith-und...

The author thinks that he relied too much on weak studies for the thesis of the book.


AKA our brains are hyperthreaded. The observation doesn't seem that salient.

That said, I believe the article to be incorrect. Our brain does do various sorts of parallel processing (visual cortex, keeping your heart beating, maintaining sense of balance, etc). It's just that the cores happen to be more specialized.


You can add many learned things, like driving, riding a horse, playing the piano, and attacking math problems under time pressure to the sort that apparently involve parallel processing.


> Imagine, for example, you’re creating a basic game. You might have one thread dedicated to updating the graphics on the screen

Has the author never actually written a game? The majority of games are single-threaded. You calculate the positions, moves, strategy of the AI, etc., and then eventually render the frame. For years, certain releases of Doom and Quake were the benchmarks for games because they were some of the first to use multiple threads and could scale to multiple CPUs/processors.

Sure, modern games tend to be able to use more cores and have some independent threads, but the majority of the work typically still happens in the main event loop.
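The shape of that main loop, as a sketch (state fields, velocities, and frame counts all invented for illustration):

```python
# A minimal single-threaded game loop: simulate, then render, strictly in
# sequence on one thread, once per frame.
def update(state, dt):
    state["x"] += state["vx"] * dt   # physics / AI / game logic
    return state

def render(state):
    # Drawing happens only after this frame's update has fully finished.
    return f"player at x={state['x']:.2f}"

def run(frames, dt):
    state = {"x": 0.0, "vx": 60.0}
    frame = ""
    for _ in range(frames):
        state = update(state, dt)    # step the world
        frame = render(state)        # then draw it
    return state, frame

final, last_frame = run(frames=3, dt=1 / 60)
print(last_frame)  # prints "player at x=3.00"
```

Everything interesting happens inside that one loop; threads, when used, tend to feed it rather than replace it.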


Listen to Terence McKenna's recordings where he describes tripping experience, or somebody doing DMT, or Michael Stevens doing ayahuasca. Maybe all these observations are anecdotal and not scientific, but it feels like ideas and thoughts live their own lives inside neocortex.

Our brain's potential is harnessed for reptile and mammal tasks, which are more asynchronous than parallel. Even if we could overclock ourselves into massively parallel thinking, it would end up in an oxygen or glucose deficit. And mind overheating: whether the CPU is made from silicon or grey matter, there's a limit on FLOPS per joule.


I’ve generally found that a context switch from one programming task to another can be pretty easy (even if totally unrelated), but switching to something non-programming related is often much more painful.


The brain is the hardware; consciousness is experienced in the mind. The mind is the faculty for processing thoughts; it is also the faculty we use to perceive the self/ego and our reality, with help from sensed data. The highest level of our consciousness, the mind, appears single-threaded because the perceived self/ego is attached to the mind's thread; multiple threads would be like multiple selves. But we can be conscious of the fact that we are thinking - that is itself a sort of thread running, and we can perceive that one also, and the other...


Sometimes when I'm meditating I experience a mini ego-loss. I come out of it feeling like I'm merely a collection of running processes over which I have little to no control. I feel like I get a brief glimpse of 'multithreads,' and it's somewhat emotionally unpleasant.


Yes exactly. My first comment was based on what I experience when meditating.

> I feel like I get a brief glimpse of 'multithreads,' and it's somewhat emotionally unpleasant.

I think it helps not to identify the self as the thoughts/processes of the mind; the self is the observer/awareness.


This article is nonsense and I don't think any background research was done before writing it. It takes almost no effort to come up with an example of parallel processing in the brain. Pat your head and rub your tummy at the same time. How many times have you been driving and talking and listening to the radio at the same time?

He must mean that conscious attention is not multi-threaded, and that is obviously true because the nature of paying attention is that you are focusing on a single thing.


As I read it, it seems to be saying that our brains are multi-threaded, but not multi-core or multi-processor.

The lower priority 'threads' which are not the intended focus still do take cycles in brief moments when the main task is idle only to be preemptively interrupted at the cost of many context switches and adding very little value. The same happens with hardware CPUs which is why coroutines are much more effective when the number of tasks >> number of processing units.
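For a sense of how cheap cooperative switches are when tasks vastly outnumber processing units, a toy sketch of my own (asyncio standing in for coroutines generally):

```python
import asyncio

# Ten thousand cooperative tasks multiplexed on a single OS thread: each
# "switch" is just the coroutine returning control to the event loop, with
# no kernel context switch and no per-task OS thread.
async def tiny_task(i):
    await asyncio.sleep(0)   # a yield point, not a real wait
    return i

async def main():
    results = await asyncio.gather(*(tiny_task(i) for i in range(10_000)))
    return sum(results)

total = asyncio.run(main())
print(total)  # 49995000: all 10,000 tasks completed on one thread
```

Spawning 10,000 OS threads for the same job would mostly buy you scheduler overhead, which is the tasks >> processing units point above.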


Also we do have multiprocessing abilities, just not multiple 'foreground' conscious ones.


We sort of do, but at best it's not more efficient, at worst it's much slower.


I've never experienced being aware of two unrelated things at the same time so I definitely have a single-core consciousness.


> This is all to say that the closer I look at the evidence regarding how our brains function, the more I’m convinced that we’re designed to be single-threaded, working on things one at a time, waiting to reach a natural stopping point before moving on to what’s next.

So it's another hypothesis, parading around as a common claim.


My brain is. I am simultaneously looking at the screen while writing this comment by tapping my fingers too.


I'm not sure whether this is an example of memorization or multithreading, but playing chess blindfolded against three opponents is out there: https://youtu.be/xmXwdoRG43U


That's actually pretty easy for any experienced chess player. The world record is simultaneously playing 45+ opponents blindfolded, which is actually incredibly hard.


Multiplexing in the brain is apparently achieved by maintaining a high and low (theta) peak frequency -- when at a ratio of the golden mean, each separate period of high activity doesn't interfere with the other. Makes sense, when you think about brainwave bands being harmonics (frequency doublings), as that is what permits frequency coupling-- whereas irrational ratios prevent coupling.

Pletzer, B., Kerschbaum, H., & Klimesch, W. (2010). When frequencies never synchronize: the golden mean and the resting EEG. Brain research, 1335, 91-102.


Btw, I know that anything involving the golden mean smells like crank science. Note, though, that Wolfgang Klimesch is at the top of the field in EEG research citations...


A computer is a bad metaphor for the brain; there's no reason to strain the metaphor even further by dragging threading into it.


There’s a special irony to being distracted enough by a Cal Newport article to feel the urge to comment on it!


Not in the time-slice way we programmers usually think of when talking about multi-threading, but how many times have you let a problem be, only to have an epiphany some time later? Background thinking is very real and, albeit not a fair allocation system, it is, in a sense, threading.


Yet! Give it time, we'll add a peripheral port soon enough


More complex than any man-made machine



