Everything You Thought You Knew About Learning Is Wrong (wired.com)
217 points by rbanffy on Jan 30, 2012 | 48 comments



This is a terrible title for an OK article. Basically it is about spaced repetition and the contextual interference effect.

The argument for the latter is that if you focus on one thing, your brain relies more on short-term memory to work things out. If you vary randomly within the domain, the constant switching of context forces more reliance on long-term memory structures. So you learn more slowly per session but show better generalization and retention of the material in the long term. It's good if you are just learning, but not useful if you are trying to figure something out.

The mechanisms are little understood but what's really cool is that a similar effect is seen in machine learning. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.117...

http://en.wikipedia.org/wiki/Varied_practice

http://jn.physiology.org/content/106/5/2632

http://www.cogsci.northwestern.edu/cogsci2004/ma/ma239.pdf


I have to agree with you; I generally find those link-bait "everything you blabla is wrong" titles rather offensive. They a) say nothing about the content of the article, b) blatantly insult the reader, and c) are a very cheap and apparently "hip" way of fishing for click-throughs. I do not understand how titles like those keep getting upvotes, and I feel quite sorry for authors who write good/OK articles while their bosses choose horrible headlines. I am sorry for the off-topic rant, but it reached a tipping point for me.


Well, clearly titles like this one sell a lot more..


Not understanding the mechanism makes this kind of research very haphazard. For example, his tennis example reflects something that many players already believe, namely that active interference from the conscious mind detracts from learning. (This dates back at least to the 1970s and Timothy Gallwey's "The Inner Game of Tennis.") What strikes me is that the author's mixture of activities would certainly serve the same goal of diverting conscious attention away from the skill being learned. Furthermore, constant conscious attention and reliance on short-term memory would always go hand in hand. So the article's observation about mixing activities, your observation about short-term memory, and the widespread folklore about avoiding conscious interference with learning could all have the same mechanism. Until we find the mechanism, we're stuck trying to recreate superficial aspects of these scenarios without understanding why they sometimes produce results. It's the very definition of cargo cult learning.


Ah, is that a reference to Feynman? Your criticism is valid, but the work being done in this area is certainly not of the phlogiston type. In fact, your criticism applies to much of modern medicine, where the systems are so complicated and so little understood that we can only catalogue observations, look for statistical correlations, construct simple models, study animal models, and back hypotheses with little more than logic and statistical arguments/tests. Things are improving thanks to computers: the ability to solve complex non-linear systems numerically is allowing more mathematical models of the fundamentals to be developed. But a unified theory of biological systems is a very long way away. Until more fundamental understanding is developed, the best that can be done is inductive inference. As long as there is good design of experiments, meaningful data being collected and studied, and peer review, it is not cargo cult science.

As for your criticism: spaced repetition is well backed by experiments and centuries of observation, and it fits with our growing understanding of memory - see long-term potentiation and PKMzeta.

Contextual interference is trickier, as it is not fully understood, but it too has been studied for a long time. Unlike a cargo cult, various hypotheses are being offered, tested and refined. For example, the short-term/long-term memory observation comes from recall tests in which stroke patients with impaired short-term memory (regardless of method) and subjects using varied practice performed better after 24h+ than subjects with functional working memory using blocked practice. The hypothesis is that the structures active in long-term recall get more exercise with varied practice, leading to better long-term performance. Another hypothesis is that interleaving examples raises the challenge level, and hence engagement, activating motivational/reward circuits of the brain that are important for learning. In addition, interleaving forces a more general model to be learned, encouraging generalization instead of just short-term memorization.

The comparison to machine learning (they use ANNs) is this: when the training examples do not reflect the distribution they are drawn from, the learned model is less general. In particular, when a new set of examples with different properties is trained on, it interferes negatively with material that has already been learned. In the paper they give examples of similar effects in rats and humans.
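To make that concrete, here is a toy sketch of the interference effect (my own illustration, not the model from the paper): a linear model with shared weights, trained by plain SGD either in two blocks or on a shuffled mix of the same examples. Because the weights are shared, the blocked schedule lets the second task overwrite what was learned from the first, while interleaving settles on a fit that still covers the earlier material.

    import numpy as np

    rng = np.random.default_rng(0)

    def sgd(examples, w, lr=0.05, epochs=50):
        """Streaming SGD on squared error for y_hat = w[0] + w[1] * x."""
        for _ in range(epochs):
            for x, y in examples:
                grad = (w[0] + w[1] * x) - y
                w = w - lr * grad * np.array([1.0, x])
        return w

    def err(w, examples):
        return np.mean([(w[0] + w[1] * x - y) ** 2 for x, y in examples])

    # Task A: inputs in [-1, 0] should map to +1.  Task B: inputs in [0, 1] map to -1.
    task_a = [(x, 1.0) for x in rng.uniform(-1, 0, 200)]
    task_b = [(x, -1.0) for x in rng.uniform(0, 1, 200)]

    # Blocked practice: all of task A, then all of task B.
    w_blocked = sgd(task_b, sgd(task_a, np.zeros(2)))

    # Interleaved practice: the same examples in shuffled order.
    mixed = task_a + task_b
    mixed = [mixed[i] for i in rng.permutation(len(mixed))]
    w_mixed = sgd(mixed, np.zeros(2))

    print("error on task A, blocked    :", err(w_blocked, task_a))  # large: task B overwrote it
    print("error on task A, interleaved:", err(w_mixed, task_a))    # much smaller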

Finally, it is important to note that randomly interleaving examples is not useful when you are trying to figure something out - where short-term memory is key. The observation is that contextual interference disappears for complex tasks, because the information requires more complex processing and integration in working memory. The ability to hold more information and more complex models in working memory correlates strongly with the ability to understand things, and interfering with this process leads to poor performance both short and long term. As the material becomes familiar, gets compressed, and basic models are composed, the difficulty drops. It is here that interleaving becomes important.

The practical implication: when practicing drill exercises, rather than blocking the material by similarity, interleave it so that a less biased model is formed and the material stays challenging enough to hold your focus. When designing exercises, break them up into the core simple, defining examples and interleave those; then gradually increase the difficulty by composing more difficult exercises from the simpler material, continuing to interleave these with the simpler basis set.
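As a rough illustration of that kind of schedule (a hypothetical drill bank and my own scheduling sketch, not anything from the article or the research): exercises are grouped by topic and difficulty tier, and within each tier the schedule shuffles across topics instead of finishing one topic before starting the next.

    import random

    # Hypothetical drill bank: topic -> exercises grouped by difficulty tier.
    drills = {
        "fractions": [["1/2 + 1/4", "2/3 - 1/6"], ["3/4 of 2/5", "simplify 18/24"]],
        "percents":  [["10% of 50", "25% of 80"], ["price after a 15% discount on 40"]],
        "ratios":    [["scale 2:3 up to 12 items"], ["split 24 in the ratio 3:5"]],
    }

    def interleaved_schedule(bank, reps=2, seed=0):
        """Yield (topic, exercise) pairs tier by tier, shuffled across topics within a tier."""
        rng = random.Random(seed)
        max_tier = max(len(tiers) for tiers in bank.values())
        for tier in range(max_tier):
            pool = []
            for topic, tiers in bank.items():
                if tier < len(tiers):
                    pool.extend((topic, ex) for ex in tiers[tier] for _ in range(reps))
            rng.shuffle(pool)   # interleave topics instead of blocking them
            yield from pool

    for topic, exercise in interleaved_schedule(drills):
        print(f"[{topic}] {exercise}")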


That's quite a lot of conjecture about the mechanisms behind what is observed in the research, but how far do the research findings themselves take us? For example, I've read about how spaced repetition affects verbal and visual recall tasks, but has its effect on motor coordination been studied at all? It's a big leap to generalize from verbal and visual recall tasks to motor coordination tasks, yet here we have an article about a researcher who studies recall of telephone numbers and such, and the article blithely advises readers to apply his research results to learning a tennis stroke. That's a terrible application of science.

I could be wrong, but the article only mentions research on recall. The conclusions reported by the article match research results on recall. The article features one researcher, who happens to study recall. Now, if you think it's common sense that research on recall says little or nothing about hitting tennis balls or dancing the Viennese waltz, then how are you supposed to read the article? Science writing elides a lot for the sake of brevity and breeziness, but I have a hard time trusting that the author knew about research justifying the same conclusions in the case of tennis and dancing and decided that instead of mentioning how broadly the findings had been confirmed, he would lead his readers to believe that it's all based on one narrow path of research. (Also, I don't think it's a very good defense of science writing to say that an author didn't make a mistake in reasoning, he only encouraged his readers to do so.)


Oh, no I did not think much of the article. I had already encountered these concepts earlier. My examples are not based on the article but on research I have consumed. I pasted some links in my first post. Yes, this has been long studied for motor tasks. Actually much of the research on the matter is based on motor learning.


From what I recall, spaced repetition works best for simple motor tasks, and less well for skills of increasing complexity - say operating a traffic control tower.


This seems to be about memorization as opposed to understanding (or skills).

I've always seen myself as having a poor memory, but found that if I focused on understanding, I could remember easily. I thought this was because understanding itself is simpler (so there's less to remember), and one can reconstruct the facts from this understanding (also, verify them).

This is treating understanding as a theory, in terms of which the facts can be stated more briefly than without it. An abstraction, if you like, that factors out commonalities/redundancies. If A always implies B, then you just need to remember "A".

But later, I realized that in fact I do remember a great deal of detail that isn't derived from understanding. Perhaps understanding actually requires a lot of domain knowledge - facts - that the understanding is about. You can't "understand" in a vacuum. So now I think it's just that somehow, causal connections and explanatory relationships stick in my memory more easily - they are certainly more interesting to me (because meaningful), so that helps with concentration and therefore memory.

I actually once tested my opinion that I couldn't memorize, for a university exam, and found - to my amazement - that I could. It really was surprising. But I didn't score nearly as well as I usually did, since I wasn't focused on understanding - which is what it is all about, IMHO.


Memorization gives you the building blocks of skills.

It has a really bad name amongst certain education commentators. It's certainly true that you shouldn't try to memorize in a vacuum, and that you should still test "higher order" skills (like complex questions, which require a real understanding of the basic blocks), but memorization of connected facts or processes is essential.


The point is, for many of us, the best way to “memorize” things is to explicitly avoid trying to memorize them, and focus instead on understanding context and connections between things – after you’ve had to call up the same idea 5 or 10 times because it was related to something else you were learning about, recall just comes along.

Examples I’ve often seen used by proponents of rote learning include multiplication tables and spelling lists. I’m better at mental arithmetic than almost everyone I know, and I almost never misspell words, not because I spent any effort memorizing lists of them (indeed, I completely refused to do this as a 9-year-old, because it bored the crap out of me), but because I’ve spent more time in exploratory play with the relationships between numbers than they have, and have spent lots of time reading, doing close textual analyses for my own edification, trying to write tight fluid prose, and playing with various kinds of poetry, etc.

I’m convinced that rote memorization is promoted as a pedagogical tool because it is cheap and easy and requires little effort from teachers and schools, more than because it’s particularly effective. Then again, everyone learns differently, and I know some people who are incredible at chewing through lists of facts once quickly and dredging them back up effortlessly months later. YMMV.


This sort of thing seems to me to be just you stumbling on the spaced repetition effect by accident: you look up the same thing several times, at varying intervals as you learn it more.

I personally think multiplication is actually a great thing for spaced repetition if you can get the software to pick random numbers each time. I also find that things I don't review regularly just fall out of my head; spaced repetition software is good at stopping that relatively easily.
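Concretely, something like this toy sketch (hypothetical, not any particular app): each "card" is a whole times table rather than a fixed prompt, the operands are re-rolled on every review, and the review interval simply doubles after each correct answer.

    import random
    from datetime import date, timedelta

    random.seed(0)

    class Fact:
        """One times table, drilled with fresh random operands on each review."""
        def __init__(self, a):
            self.a = a
            self.interval = 1               # days until the next review
            self.due = date.today()

        def quiz(self, today):
            b = random.randint(2, 12)       # new operand every time, so you learn the table, not one card
            answer = int(input(f"{self.a} x {b} = "))
            if answer == self.a * b:
                self.interval *= 2          # correct: push the next review further out
            else:
                self.interval = 1           # wrong: see it again tomorrow
            self.due = today + timedelta(days=self.interval)

    today = date.today()
    for fact in [Fact(a) for a in range(2, 13)]:
        if fact.due <= today:
            fact.quiz(today)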


Speaking only for myself, learning things rote, via software-driven spaced repetition or otherwise, is deadly dull. On the other hand, finding new patterns, reading relevant books and deconstructing their arguments, holding discussions with other people passionate about a subject, trying to construct new ideas and relate them to past knowledge, etc. is exciting and invigorating. Living life as a human rather than a robot may be less than perfectly efficient relative to some platonic ideal, but really, so what? Similarly, I’d much rather go on a hike than run on a treadmill, I’d much rather cook and eat a delicious meal than compute the perfect mixture of nutrients and blend them into a shake–slurry, and I’d rather make one deep friend than fill a rolodex with business cards. My personal impression from having dabbled with spaced repetition software after a friend bugged me about it for months was that I personally learn better when learning things doesn’t feel like a visit to the dentist. Call it a character flaw. Again, YMMV.

My point about the multiplication tables and spelling lists though was that explicitly learning e.g. multiplication tables is pretty much unnecessary, because multiplying small numbers comes up again and again in doing more exciting and challenging problems, and by the 20th time you’ve had to multiply 6*8 through some explicit manual method, you’ll just start to remember that oh yeah, that’s 48 again.


While I agree that rote learning is very dull, it can be fundamental in situations where you know next to nothing of the subject at hand.

Consider for example two native English speakers, one learning German and the other learning Chinese. German grammar is way more complex than Chinese grammar, but because of the etymological closeness between German and English, the German student will quickly get to the point where they can easily pick up new words from context (obviating the need for vocabulary lists), while the Chinese student will pretty much never reach that point. In this type of situation an SRS is invaluable.

Regarding multiplication tables, you shouldn't forget that knowing them by heart is sometimes necessary for understanding the exciting problems. This comes up all the time in mathematics: to understand the important problems in a field, you sometimes have to get to know well a lot of seemingly boring concepts.


Knowing multiplication tables by heart is never necessary for understanding exciting problems; the exciting problems are not dependent on your arbitrary culturally-contingent choice of numeric base! I had discovered the commutative and associative properties of multiplication long before memorizing multiplication tables in fourth grade, and historically speaking, they were known long before the adoption of place-value numerical systems.


It might be essential, but it's also overrated as a primary measure of mastery. The more difficult the subject matter, the less useful rote memorization is in the end; there were plenty of people with near-photographic memories that I saw failing upper level math courses at university. And I knew plenty of people that excelled at that level but had to look up specific formulas all the time (as best as I could figure, they tended to remember quite well what transformations the formulas allowed, just not the exact form that they allowed them in).

That said, it's certainly not going to hurt anyone to increase their rote memorization abilities, in fact it's probably extremely useful in general, no matter what field you're in.


Yes, I've had similar thoughts. I think when successfully "understanding", people see something in multiple different ways, ask Why? a lot, then finally relax once they're sufficiently convinced. This may use different kinds of cognition, like visual, kinesthetic, etc. And it may feel like diverse parts of their mind are at various states of being convinced/unsatisfied.

Whereas, when trying to "memorize" something, many people just kinda... hope the thing stays in memory through some unknown process. But you can "cheat" by consciously leveraging various cognitive abilities. (http://en.wikipedia.org/wiki/Method_of_loci#Contemporary_usa...) (Along with various little tips, like use lurid imagery, it's probably better to visualize large spaces you can walk through rather than tiny ones, etc.)


That's my experience too. It's something of an ordeal. Perhaps each look from a different angle, while done for the purpose of convincing oneself, also forms a new connection (which is known to aid memory)?


This is part of the reason I've always found harder science/engineering classes way easier than, say, languages. Physics, math, etc. classes have very few "laws" that need to be memorized; everything else is linked. Languages, though, have thousands of words that need to be individually memorized.


One of the most effective ways to learn a new language (I've found) is to actively tie new vocabulary & grammatical structures to existing knowledge and etymological backgrounds. This makes it much easier to recall and use in familiar ways.

The corollary is my personal hypothesis that learning Latin actively helps you learn other European languages. In fact, learning French probably makes it easier to learn German, and knowing both gives you more background on which to build the English language. Eventually the limiting factor is memory and ability to differentiate language in context.


I'm not sure if this is the point you were making, but I'm really skeptical of the value of learning Latin as a route to learning modern Latin languages. It seems to me that if you want to learn, for example, Spanish, then you are considerably better off learning Spanish straight away and not Latin then Spanish.

And if you're getting carried away and want to learn a second or a third one, then Spanish will give you most of the value of Latin in terms of transferable knowledge.


I was the same way. Foreign language was extremely difficult. Then I realized that memorization itself is a process that can be learned, improved, and mastered.

Once I began studying and learning the process of effective memorization, then foreign language became fun. It was a way to prove to myself that memorization techniques worked.

Granted, learning a language is more than just memorization of vocabulary. But elimination of that huge hurdle makes the rest more interesting and tractable.

I was tipped off to this by the best seller "The Memory Book" (which I highly recommend), but now there are plenty of web resources that may be as effective.

Here are some relevant links: http://www.amazon.com/Memory-Book-Classic-Improving-School/d... http://en.wikipedia.org/wiki/Memorization (and click on all the links on this page)


Spacing of facts or study does help understanding and generalization, though, in those studies which specifically looked at that: http://www.gwern.net/Spaced%20repetition#abstraction


Why has no one mentioned SuperMemo? The software has been around since the '80s. I use it every single day (because you have to) for about 30 minutes to remember Latin vocabulary. It's based on spaced repetition. If you feel a particular piece of knowledge is worth remembering permanently, I would recommend using it. http://www.supermemo.com/

There's actually a Wired article about the author of the software: http://www.wired.com/medtech/health/magazine/16-05/ff_woznia... Completely eccentric and totally devoted to using the thing. I mean, he doesn't even decide when to respond to his own mail; SuperMemo "schedules" it for him. I believe the latest version supports something called "incremental reading", which Wozniak claims is better than the usual order in which we read things. You read till you get bored/tired and then move on, and it'll keep track of where you were, etc.


There's actually an open source app which functions similarly to SuperMemo, but uses an older spaced repetition algorithm and doesn't quite have the same feature set. It's called Anki (http://www.ankisrs.net), and it's been a godsend for both myself and many friends who learn Japanese.

The thing I found interesting in this article is that switching contexts actually increases the amount you can learn. From what I understand, this would mean studying grammar, character reading/production and vocabulary at the same time would be beneficial. I never quite understood why this was the case before, even though people have told me it is so.


I also use Anki heavily for learning (Japanese), though you can learn all sorts of things very efficiently with it once you get into the habit of using it. The program is great: it comes with >free< syncing/online backup, provides iOS/Mac/Linux/Win/Android/Maemo clients and a slick browser interface.


I really like that there are so many clients. I would really like some kind of cloud sync + mobile client for SuperMemo. It also only works on Windows, which means I have to run it under VirtualBox.

My only concern with Anki is whether it still uses some variant of the SM-2 algorithm, and whether the people implementing the software are doing more than just polishing interfaces and are actively working on improving the learning algorithm. Looking at the history, SuperMemo is on SM-11 (http://www.supermemo.com/english/algsm11.htm), and Wozniak claims there is substantial evidence that this algorithm increases the speed of your learning.


Yes, Anki used to use a slightly modified version of SM-5, but the lead developer decided to revert to a modified and improved version of SM-2. Basically he reasons (you can find his explanation in the Anki FAQ) that while versions newer than SM-2 are supposed to (slightly) increase efficiency, that only happens if you study every day and at roughly the same time of day. The reason is that starting with SM-3, a failure on a certain card changes the difficulty of similar cards (as determined by the algorithm). A further explanation by the lead developer can be found here: http://markmail.org/message/u2zfnrg7x53bzp24#query:+page:1+m... I am convinced that it (at least for me) offers much better studying conditions. (Actually, the mere fact that I can revise on a crowded 30-minute commute on my iPod would make up for any efficiency increase.)
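For reference, the published SM-2 scheduling rule looks roughly like this (a simplified sketch based on the public description; Anki's modified variant differs in the details, so treat it as illustrative rather than Anki's actual code):

    from dataclasses import dataclass

    @dataclass
    class Card:
        easiness: float = 2.5   # SM-2's "E-Factor"
        reps: int = 0           # successful reviews in a row
        interval: int = 0       # days until the next review

    def review(card: Card, quality: int) -> Card:
        """Reschedule a card after a review graded 0 (total blackout) .. 5 (perfect recall)."""
        if quality >= 3:                    # recalled successfully
            if card.reps == 0:
                card.interval = 1
            elif card.reps == 1:
                card.interval = 6
            else:
                card.interval = round(card.interval * card.easiness)
            card.reps += 1
            # Easiness drifts down for difficult recalls and up slightly for easy ones.
            card.easiness = max(1.3, card.easiness
                                + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
        else:                               # forgotten: restart the card
            card.reps = 0
            card.interval = 1
        return card

    # A card answered "good" (4) on three reviews in a row: intervals go 1, 6, then ~15 days.
    card = Card()
    for _ in range(3):
        card = review(card, 4)
        print(card.interval, round(card.easiness, 2))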


It's really a shame that almost none of what we know about optimal techniques for learning is being applied in real educational institutions.


This isn't even approximately true. Almost everything we know about optimal techniques for learning is being applied in real educational institutions. But, not every teacher is as effective at applying them, and furthermore, it takes time to do a full overhaul of all parts of everything.

A more effective lament would be, it's too bad these techniques don't yet have universal application. But the above comment is just sniping; it's a claim that educators aren't even trying. Which, aside from being false, is acutely unhelpful.


I think what's truly unhelpful is people claiming that everything is fine while the education system completely fails to meet even the most basic standards of efficacy (while costing a cool trillion dollars a year to run).

At the most basic level Americans have shockingly low levels of scientific, mathematical, literary, or financial literacy. About half don't believe in evolution, about a quarter think the moon landing was a hoax, over two thirds don't know enough math to pass an 8th grade math test.

At the college level, students retain about 10% of what they hear in lectures, forget nearly everything within months of graduation, and generally fail to improve on basic logic and reading comprehension skills compared to when they first entered university. Then after graduation the average college grad reads less than one book per year.

If you ask a typical college student (and remember I said typical, not those who can afford to pay 50k per year to attend an elite liberal arts college) what their university experience has been like they will tell you that in a solid three quarters of their classes they had a disinterested prof who showed up for 50 minutes to read them the textbook and then asked them to mindlessly regurgitate the content on the exam.

So I'll repeat that I think what's truly unhelpful is guys like you claiming that everything is hunky dory.


> So I'll repeat that I think what's truly unhelpful is guys like you claiming that everything is hunky dory.

"Hunky dory"? Not to overly pick on your own logic and reading comprehension skills, but I made no such claim. We have a lot to do, and there's a lot that is known that doesn't have universal application, and some things aren't even widely implemented yet. I was, however, disputing the claim that "almost none" of it is implemented.


Did you even read your own comment?

"Almost everything we know about optimal techniques for learning is being applied in real educational institutions. But, not every teacher is as effective at applying them..."

I guess my reading comprehension must be totally shot, because I interpret that as meaning that almost everything we know about techniques for learning is being applied, and the problem is just that some teachers aren't applying these techniques correctly. You know, instead of about 80% of teachers not giving a shit and there being no consequences for them.


One of the reasons for having midterm exams followed by finals in certain institutions is to apply the spacing effect described in the article. You study for a midterm, forget some of it before your final, then reinforce it at the end of your term. I found that it worked for me when I was in college.


[citation needed] I don't think midterms exist for this reason.

And even if they did, the separation is too long to actually work as a spaced repetition tool.


I find that the learning-in-different-places thing is really key. In school, for exams where I've been forced (by schedule) to study everywhere (my apartment, friends' places, classrooms, different libraries, coffee shops), I find it easier to recall methods and approaches. For straight-up 'facts' I don't really notice a difference, but when tackling various problem-solving exercises it really makes a difference.

I also find that studying while listening to music is always risky, since you may find yourself craving music during the exam.

Honestly, I've read many books on learning and brain plasticity, and they all give somewhat contradictory results. Perhaps because cited studies rarely use the same approaches to learning/testing.

For example, in this article we are told that 'topic-focused' studying is not the way to do it. But what they've really established is that for things with multiple facets to work away at, an interleaving approach can produce better results. But what about learning things in which there is a strong linear progression, or at least a strong 'prerequisite' relationship? Many technical courses are structured this way, so that each 'topic' builds heavily on the last.

Honestly, the only consistent thread that I can really take out of all of this is that learning anything takes consistent, long-term, DIRECTED work. It is possible to drop something into your memory and keep it at an 'adequate' level for a long time through somewhat consistent, more 'relaxed' work, but this 'knowledge' will leave you quickly as soon as you stop. Whereas if you put in long-term directed work, it will end up embedding itself into your memory. After you stop, you may not be able to immediately recall these things, but after a quick refresher you'll retain very, very large amounts of information.


Another article on the same subject: http://www.nytimes.com/2011/09/11/opinion/sunday/quality-hom...

According to the article, there are three main routes to efficient learning:

1. Spaced Repetition - where the information is repeated, but with longer and longer intervals between each repeat.

2. Retrieval Practice - where you have to actively recall the answer or technique rather than being told or shown it.

3. Cognitive Disfluency - the harder you have to work to read or understand the material, the more of it you will remember. (I'm not sure how useful this point is. The New York Times article refers to research where the text was blurry, but from this it concludes that interleaving different topics would give good results... I'm not convinced that would have the same effect.)
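For the first two points, here is a tiny sketch of what they look like in software (my own illustration with made-up items, not code from either article): each question has to be answered from memory before the answer is shown (retrieval practice), and an item's review interval doubles after every successful recall (spaced repetition).

    import random
    from datetime import date, timedelta

    random.seed(1)

    # Hypothetical study items: question -> answer.
    items = {
        "Capital of Australia?": "Canberra",
        "Year of the French Revolution?": "1789",
        "SI unit of force?": "newton",
    }

    # Every item starts due today with a one-day interval.
    state = {q: {"due": date.today(), "interval": 1} for q in items}

    def study(today):
        due = [q for q, s in state.items() if s["due"] <= today]
        random.shuffle(due)                  # a little interleaving across items
        for q in due:
            guess = input(q + " ").strip()   # retrieval practice: recall before seeing the answer
            correct = guess.lower() == items[q].lower()
            print("correct!" if correct else f"answer: {items[q]}")
            s = state[q]
            s["interval"] = s["interval"] * 2 if correct else 1   # spacing: back off after each success
            s["due"] = today + timedelta(days=s["interval"])

    study(date.today())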


Everything You Thought Would Be Referenced Isn't

Lovely premise, but how am I expected to take this seriously without sources and references?


> Really, I recently had the good fortune to interview Robert Bjork, director of the UCLA Learning and Forgetting Lab...

That's the source. This isn't a research paper - it doesn't need cited references to be submitted to the journal of Wired.


It doesn't need to be, but it certainly helps expand the conversation.


If that's the source, it ought to link to a recording of the interview. What is this, 1995? Link a torrent if you don't have the bandwidth, Wired.


A learning hack I've found very effective (not that it is very original, but I was surprised how effective it is): study with others, where each of you tries to explain what you read to the others, and argue about what it means, why the others got it wrong, etc.

For me it's the best motivation to try hard to understand something when I can show off shortly after :) And it gives immediate feedback - if you can't explain something, you don't really understand it. And when a few people are learning together, they find most of the patterns quickly, which makes learning the rest easier.


Does anyone else get tired of Wired's sensational headlines, which come off as spammy? The logic in their headlines is often faulty, which keeps me from reading their content or taking them seriously.


Plenty of publications by Robert Bjork here: http://bjorklab.psych.ucla.edu/RABjorkPublications.php


I've tried SRS's and don't like them. They're only good for memorization, and if you're memorizing something that you need to use, using it is a -lot- better than using an SRS.

Plus, memorizing complex things via an SRS isn't easy. Figuring out how to break the material down into small enough cards is really hard for any non-trivial information.

For instance, I tried to use it to study for the PHP Cert from Zend. I passed the test, but only because I spent so much time researching to make the cards. The actual studying of the cards was pretty much useless after having done the real work of making the cards.

I tried to use it for studying languages, but I've found that the time spent making the cards is pretty much wasted (it's copy and paste, no thought involved) and studying them beyond the first few reps is pointless, too. If I can get through a few reps, and I'm actively using the language, there's no need to continue doing reps beyond that. I'm better off letting someone else make the list, studying the cards like normal flashcards a few times, and then throwing them away. (Better yet is something like iKnow that has the word, translation, picture and sentence.)

In the end, I've never actually found a use case for an SRS system that wasn't trumped by something else either because of easiness or just being better.


Everything You Thought You Knew About Learning Is Wrong

And it turns out that everything I thought I knew about learning is wrong.

These titles kill me. It's like the author has found the ultimate gold mine. It has actually been my learning strategy for years: when approaching a particular topic/field, I start by reading related things to gain domain knowledge. This helps a bit later and accelerates my learning.


Yet another reason to write real code that does real things when you want to learn a new technique.


This is quite contradictory to the way I've been learning (especially in classes where a lot of memorization was required, e.g. law or business administration). I'll definitely give this a shot the next time I'm cramming for an exam.



