1. Deliberate practice only works for skills with a history of good pedagogical development. If no such pedagogical development exists, you can’t do DP. Source: read Peak, or any of Ericsson’s original papers. Don’t read third party or popsci accounts of DP.
2. Once you realise this, the next question you should ask is: how can you learn effectively in a skill domain where no good pedagogical development exists? Well, it turns out a) the US military wanted answers to exactly this question, and b) a good subset of the expertise research community wondered exactly the same thing.
3. The trick is this: use cognitive task analysis to extract tacit knowledge from the heads of existing experts. These experts built their expertise through trial and error and luck, not DP. But you can extract their knowledge as a shortcut. After this, you use the extracted tacit knowledge to create a case library of simulations. Sort the simulations according to difficulty to use as training programs. Don’t bother with DP — the pedagogical development necessary for DP to be successful simply takes too long.
Broadly speaking, DP and tacit knowledge extraction represent two different takes on expertise acquisition. For an overview of this, read the Oxford Handbook of Expertise and compare against the Cambridge Handbook of Expertise. The former represents the tacit knowledge extraction approach; the latter represents the DP approach. Both are legitimate approaches, but one is more tractable when you find yourself in a domain with underdeveloped training methods (like most of the skill domains necessary for success in one’s career).
As someone who's studied this field extensively and applied it professionally, I'll go ahead and say I wouldn't recommend Accelerated Expertise to anyone. Reading it I got the distinct impression that the authors did not understand a great deal of the research they cited, either when supporting or dismissing it. Some good criticism of the book can be found here[0]. If some acronyms confuse you, the main one to know is CLT, Cognitive Load Theory. The idea, roughly, is that humans can hold only a relatively fixed amount of material in their heads, and the process by which we become more expert is building "schemas" that let us automate processes and group many small ideas together. When learning, you want to create robust, correct schemas quickly.
I disagree with this assessment. (Or, more accurately, I'd like to see a more comprehensive counterargument.)
The article you link to is essentially a response to two pages in the book, where Hoffman et al. mention, almost in passing, that CLT is a silly theory when you want to train for real-world scenarios (the intuition is that if you're training marine fire squad commanders to plan on the battlefield, perhaps it helps to simulate shooting at them during training?). Hoffman et al. use this as an example of a learning theory that doesn't seem to map to real-world requirements.
This reads like a disagreement over one particular dismissal in the book, perhaps because CLT is a pet theory of the article’s authors. The problem: this argument is not core to the book!
The article does not, for instance,
a) Deal with the many examples of successful real-world accelerated training programs with no curriculum design (as commonly understood; ordering simulations isn't really designing a syllabus) in Chapter 9 (some of which were designed by some of the authors)
b) Have a rejoinder to the two learning theories presented in Chapter 11 that the authors claim underpin their training approach (if there were something to attack, this would be it!)
c) Nor have a rejoinder to a more central claim in the book (and to my mind a more controversial one): that atomisation of concepts impedes rapidised training.
And, perhaps most surprisingly to me, your claim that
> Reading it I got the distinct impression that the authors did not understand a great deal of the research they cited, either when supporting or dismissing it.
is remarkable, given that one of the authors of Accelerated Expertise is Paul J Feltovich, one of the founders of the field of expertise research, and a contemporary of Ericsson’s.
Sadly I'm not familiar with books that focus on self-learning. If you're struggling to apply the knowledge from CLT, it suggests that you're mostly missing skills related to curriculum design, like sequencing, scoping, and scaffolding. You might consider an introductory book on curriculum design (apologies, I don't have the name of a good one at hand). You might also benefit from reading something like "How I Wish I'd Taught Maths"[0], which is a great example of someone who understands these theories attempting to turn them into actionable ideas in the classroom. Following that example might help you in managing your own learning.
A mentor can teach you both the pedagogy and their own pet peeves/beliefs, so it's not quite as bad as that. You do need to consult multiple mentors to get a breadth of so-called tacit knowledge though.
The future is here, it’s just not evenly distributed.
About half of the things I used to obsess about fifteen years ago are now common if not dogma. I don’t obsessively track the things that don’t pan out but odds are I was dead wrong about a few of them. Still waiting on the rest.
Thank you so much for taking the time to explain that. This is something I wish I had read before going to uni. Knowing how to skillfully "extract knowledge" from an expert with a powerful question is a skill in itself. I wish there was more emphasis in education on uncovering tacit knowledge instead of simply being spoonfed the way it's been done for generations. This is helpful and I'm going to take this with me wherever I go from now on.
I can recommend this review too. I read it a little while ago and it led me to read the book itself, and at that point I was so hooked I started reading the Oxford Handbook of Expertise. I'm doing it a chapter at a time so it's not going fast, but it's very rewarding.
I think, by applying some of the core principles (variety of scenarios, high difficulty, guidance from expert available, high density of lessons, etc) I can learn things quicker, as well as help others learn things quicker. Even without CTA proper, which is its own skill I haven't taken the time to learn yet.
This exposes part of the reason behind the lack of success underrepresented groups have in certain fields, and bodes poorly for those subject to it. If "tacit knowledge" most expeditiously gained from interacting with experts is the most important aspect of skill acquisition in cutting-edge (relatively speaking) areas like applied STEM, then, of course, issues of mentorship and gatekeeping rise to primacy.
This has been my personal experience as well, and it makes me highly suspicious of anyone whose advice for acquiring technical skills is simply to practice constantly - "draw every day," "you have to code," "always be networking," etc. They either aren't aware of how useless this advice is, or simply don't care about your growth or performance. Which, I admit ambivalently, is reasonable in this society; if you want someone to care, pay them to. This of course opens us back up to the issue of underrepresented groups often being unable to afford formal "someone caring about your growth."
I will add, though, that some fields are far more open than others. Most notable would probably be software or computing in general. An observant person can glean insights from places like HN, Reddit, lobste.rs, GitHub/Lab threads, university course pages from all over the world, personal opinionated blogs, etc. etc. (add reading good open source code to this list too). I say observant because you need to sift through marketing (mostly HN and Reddit), influencer crap [0], (often attractive) polemics, etc. With computing, these places can come from genuine passion and respect for the craft, rather than grift like you'd find on LinkedIn [1]. From there a person can weigh up different ideas and experiences, look for patterns, try to determine what is being implied and what context has gone unsaid, what is taken for granted, etc. This would replace otherwise unavailable mentors.
Like you, what you mention has been my personal experience. My experience with people from my demographic is that some, without formal guidance, struggle, and in getting started really need someone to hold their hand. In some cases this would be a problem of confidence and self-esteem more than ability. Others can soldier through looking to pick up bits of wisdom from wherever, eventually being able to make judgements of their own.
>and it makes me highly suspicious of anyone whose advice for acquiring technical skills is simply to practice constantly - "draw every day," "you have to code," "always be networking," etc.
It's a matter of perspective I think. This reminds me of that Ira Glass quote on taste and creativity. Perhaps taste is a sort of tacit knowledge, and this is picked up from mentors, or developed independently like I mentioned above. Then a person who does not know the experience of lacking taste -- either because they have access to mentorship (and took it for granted) or because they had the confidence and ability to independently "seek" tacit knowledge (and took that confidence and ability for granted) -- lacks perspective and gives this advice. It "worked" for them, but really that is just what's apparent; they cannot express how they developed the taste or tacit knowledge (or that it is even necessary!) for this advice to be useful.
[0] I specifically left YouTube off that list. Aside from conference videos posted to YouTube (hardly any views), university lecture recordings (hardly any views past first year with a few exceptions), and maybe a feeew other channels, YouTube content is in my experience garbage and very typical-influencer, mostly hyping up megacorps, the latest buzzword-tech, and grinding leetcode.
[1] Yes computing grift is very real, but I mean that it appears that sharing information online for "nobler" reasons or genuine professional reasons is more common in computing than in some other fields.
I like how OP introduces Naturalistic Decision Making (NDM) to address the question of how to accelerate expertise. But I am just a little critical/suspicious of this narrative, because OP and the NDM experts such as Gary Klein work for companies that monetize this method. Is there any real-world evidence that could soothe this sense of suspicion?
IMO tacit knowledge is what we used to call judgement.
Judgement is basically knowing a bunch of things but having a good idea of which things are more important than others, and especially knowing which things are worth spending time on and which aren't.
It's like when I see someone talking about financial options and they put everything on the board at once: what are puts and calls, what's an iron butterfly and other strategies, do they need to know the payoff diagram, what about Black-Scholes, what are the Greeks, when should I exercise, and so on. Those are all things that a professional options trader knows, but when you're a pro you reduce your cognitive load because you know what actually matters and you are not juggling all these concepts at the same time.
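For readers who haven't seen a payoff diagram, here is a minimal sketch (my own illustration, not part of the comment above; the strikes and spot prices are arbitrary) of the kind of shape a pro has long since compressed into a single mental chunk:

```python
# Payoff at expiry, ignoring the premium paid.
def call_payoff(spot: float, strike: float) -> float:
    return max(spot - strike, 0.0)

def put_payoff(spot: float, strike: float) -> float:
    return max(strike - spot, 0.0)

# A long straddle (one call plus one put at the same strike) pays off when the
# price moves far in either direction; named "strategies" are just sums of these shapes.
def straddle_payoff(spot: float, strike: float) -> float:
    return call_payoff(spot, strike) + put_payoff(spot, strike)

for spot in (80.0, 100.0, 120.0):
    print(spot, straddle_payoff(spot, strike=100.0))  # 20.0, 0.0, 20.0
```

The pro isn't rederiving any of this; they just see "straddle" and know the shape, which is the cognitive-load reduction the comment describes.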
It's also a source of frustration when interviewing. Say you've written C++ for many years; you're probably not prepared to answer questions on all corners of the language, even though you're an expert in some sense. Someone relatively newer might be better at that task, because they're thinking a lot about everything, including things they haven't yet decided are unimportant.
Oddly, my first specialty was performance optimization. A predisposition met a troubled project and I spent half my time finding ways to speed up code without making it offensive. Then later projects had problems but no mandate, so I got pushback. I got really good at crypto-optimization: changes that improve performance but look like something else (cleanup, bug fixes, feature requests), and found there's a whole quadrant of code changes that improve both legibility and speed.
There is very little magic or intuition involved in any of this. It could be in a book. If anyone knows of a book that covers this, I'd love to gift it to people, because as far as I'm aware I'm seen as a weird (semi-retired) street preacher on this subject. But I'm just tweaking existing recipes, as it were, adding common ingredients in uncommon ways.
I guess Michael Abrash's Graphics Programming Black Book fits the bill, though no amount of literature replaces getting your hands dirty with late-80s to early-90s-level hardware and software. However, even if you do that, you're still missing the things you'd learn by being part of a team (of the kind of which there are probably only a few hundred in existence right now). How much I'd give to be part of Nintendo or id Software or some other era-appropriate team figuring out how to make the early big 3D games like Mario 64 or its contemporaries on the PSX. Of course you can bust out an SDK and try it out, but it probably won't be the same.
I somehow became a specialist at that randomly too (like you: a random troubled project thrown at me as a cheap new joiner, on the off chance I'd find something, turned into a lucky break, combined with a predisposition for enjoying these hunts). I'm now starting to see patterns and to be helpful on all the troubled projects I'm thrown at as a joker... to the point that I lead a performance optimization mini-team now.
If I were to teach, which I kind of do as the leader of the team, I'd just give a few shortcuts: 99% of perf issues are stupid and discoverable by profiling (which somehow nobody does; the more experienced the dev, the less likely they are to use a profiler and the more likely they are to have 50 wrong migration ideas to make everything better with a huge budget). Once you've compressed all the leaves of code you can find on a hot path, look at why that hot path even exists; chances are it shouldn't be called that often. If you've done that and it's still too slow: redesign, multi-thread more, remove the business need, or expand the hardware.
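To make the "just profile it" shortcut concrete, here is a minimal sketch using Python's built-in cProfile (the workload function is a made-up stand-in, not anything from the comment above):

```python
import cProfile
import pstats

def slow_report():
    # Stand-in for the real workload; imagine this calls into the hot path.
    total = 0
    for _ in range(100_000):
        total += sum(range(200))   # deliberately wasteful inner loop
    return total

# Profile the workload and print functions sorted by cumulative time.
# The point is simply to see what actually dominates before redesigning anything.
profiler = cProfile.Profile()
profiler.enable()
slow_report()
profiler.disable()

stats = pstats.Stats(profiler)
stats.sort_stats("cumulative").print_stats(10)  # top 10 entries on the hot path
```

Ten lines of ritual, and the output usually points at something embarrassing long before a migration plan is needed.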
You also start getting an instinct about scale: a troubled team, behind by half an hour on a settlement process with an exchange (so, big deal), told me they wanted to move to the cloud to spin up hundreds of big servers with a new scalable design that would take a year to develop, with zero cloud skills on the team yet. How many operations do they need to do in the allotted time? 30k, with a database in the back. Well, I haven't looked yet, but this sounds to me like we can try something before the mega migration, lol, because 30k units of work is not much and databases tend to be misused (I once fixed a million statements sent by Hibernate on a single SOAP call, so...). But as you can see, I try not to give an explanation; it's just an instinct, because prejudice is what kills perf optimization: we'll have to profile each step.
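That instinct about scale is mostly back-of-the-envelope arithmetic. A sketch of the check (the 30-minute window here is my own assumption purely for illustration; the comment only says the team was half an hour behind):

```python
# Back-of-envelope throughput check before reaching for "hundreds of big servers".
operations = 30_000          # units of work mentioned above
window_seconds = 30 * 60     # assumed 30-minute window, for illustration only

required_rate = operations / window_seconds
print(f"required throughput: {required_rate:.1f} operations/second")  # ~16.7 ops/s

# Even a modest, sanely-indexed database comfortably exceeds a few dozen ops/s,
# which is why "profile the existing path first" beats a year-long redesign.
```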
One thing I try to teach is: never migrate to a new library while calling the slow code "legacy". You'll end up either failing in exactly the same way or fake-succeeding on a small sample and leaving the actual failure to the next team, who'll call your code legacy before moving to the next fancy fad framework of the year and starting the cycle of disaster over again.
One of my weirdest fixes was removing a duplicate database request on an admin page that was taking 30 seconds to load. But instead of dropping to 15 seconds, the response time dropped to 3 seconds.
I've also removed a lot of 10% calls that improved performance by 20%. Profilers don't show the cost of your code to the CPU and virtual-memory caches. And there are new or malloc calls whose full cost is paid by another thread because of the tempo of the allocations.
As you say, looking at the count column is a huge thing. But the biggest failure I've seen is one of imagination. So many people give up when the last tall tent pole has been addressed. Even if we've only hit a 3rd of our performance goal, they believe they've tried everything and we should move on. Nobody knows how to do perf analysis with a budget. For instance, the 'current users' count should not be entitled to 10% of the page load time.
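One way to read "perf analysis with a budget" (a toy sketch of my own; the component names and numbers are invented) is to give each contributor an explicit share of the page-load time and flag the ones that blow it:

```python
# Toy performance budget: each component is entitled to a slice of page-load time.
budget_ms = {"current_users_count": 20, "render_template": 150, "serialize_json": 30}
measured_ms = {"current_users_count": 210, "render_template": 140, "serialize_json": 25}

for component, limit in budget_ms.items():
    actual = measured_ms[component]
    verdict = "OVER BUDGET" if actual > limit else "ok"
    print(f"{component}: {actual} ms (budget {limit} ms) -> {verdict}")
```

With a budget like this, "we fixed the tallest tent pole" stops being the finish line; every component either fits its allowance or it doesn't.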
Or better yet, how about just writing a series of blog posts or a newsletter?
Don't waste time on the setup, just register something on substack.com and I'm sure a bunch of folks here will subscribe. It sounds like a fascinating subject.
I once interviewed for a front-end role and had to google some CSS property name, as I still haven't committed to memory the difference between "align-items" and "justify-content", but I got dinged for that in the feedback (grateful to have received feedback at all, which often isn't the case). It was one of those things I considered unimportant in the grander scheme of FE development.
This is one of those weird little details that pretty much doesn't matter, because you just try both, and you do that every time, and it's fine. Maybe knowing the difference saves you 15 seconds.
I had an aha moment a little while ago when I realized "justify-content" works like the "justify" buttons in Google Docs/other word processors -- left/center/right. It's not a perfect parallel, since justify-content changes its meaning if the flex-direction changes, but it matches with the default behavior.
This is why often when setting guidelines around particular behavior, standards that require some judgement may be better than rules that are to be followed blindly without any discretion.
Compare the rules around speed limits: "don't drive above 70mph" (no judgement; it could be too slow or too quick in a given situation) vs the one often observed and followed in reality: "drive at a reasonable speed, roughly what others are driving at" (use your judgement about what's safe).
Likewise, you can be very good at fixing things without knowing what every single tool in your toolbox is even for. In fact, sometimes not knowing is an advantage -- you might use something in a way that was never intended that ends up solving a problem.
Your comment hit me right in the reality. I'm a uni student currently taking an OOP C++ course. At times I don't know what to ask or how to ask it because of the cognitive load. Value of, address of, private member functions, pure virtual functions, bah! I'll get there, eventually.
Tacit knowledge is the difference between making a decision and then acting, vs being free of the need to make a decision and acting right away. As in successfully.
Judgement is exactly the opposite of tacit knowledge.
“Judgment” and “tacit knowledge” belong in the same epistemic category as “magic!”, or god-of-the-gaps, or “complexity”.
They have no explanatory power; they only flag the absence of a complete explanation: specifically, the inability at present to break down and reproduce the process that generated the correct behavior.
They should not be revered for their superiority, but scrutinized for the true source of power they draw from.
I'm sorry, I was really into the whole Deliberate practice thing and then I read this and thought "Oh no, did I have it all wrong?"
So I read the book he quoted, "Sources of Power". And you know what? It has a tremendous overlap with Deliberate Practice.
The proponents of Deliberate Practice never claim to "just do it over and over" - you have to have background knowledge, a teacher/coach, etc... the tacit knowledge. Part of it is given to you and part of it is something that you're put into a position to experience for yourself.
Deliberate practice can also be developed in fields unlike music, math, and chess, such as the pursuit of mastering experimentation and discovery, which also takes a lifetime and is never entirely mastered.
The combination of tacit knowledge with deliberate practice is especially powerful, even more so when augmented by working smart in addition to working hard, rather than instead of working hard.
It's leveraged furthest when applied to a foundation of natural unfair advantage of some kind or another.
Indeed, this seems a little misleading (or perhaps I have too broad a view of deliberate practice). The submission talks about learning to ride a bike by first gliding for short periods, then longer periods, then pedalling. Is this not deliberate practice? Or identifying that you struggle with understanding software design, and then working on creating designs and getting feedback to improve them so you can develop your own intuition for how to design good software - this also seems like deliberate practice to me.
I'll assume the best in the author, that he simply interpreted deliberate practice differently than I did such that it was not useful for most things, and now he's trying to share the method that worked for him. But to me this is exactly what I would expect from deliberate practice.
As someone who's recently gotten into deliberate practice, I, too, started reading this worried I'm heading in the wrong direction.
And it left me with a similar feeling: the author suggests that acquiring tacit knowledge is more important than deliberate practice, when in fact deliberate practice is the most effective way to acquire tacit knowledge.
In the bicycle example, that would mean trying out different approaches, figuring out what works and what doesn't, and working with a coach (parent) to practice more effectively.
So, pitting these concepts against one another seems wrong, but keeping tacit knowledge in mind when applying deliberate practice might make it more efficient.
I am completely with the author on his defense of the reality of tacit knowledge, but his argument against deliberate practice depends on pedantic, hair-splitting definitions that would be better presented as second-order refinements applicable in certain circumstances, such as when you are "in a field where no ‘highly-developed, broadly accepted training methods’ exist." It may be that this style of presentation is a reaction to highly specific claims that some proponents of deliberate practice insist are universal truths.
Indeed - after the long build up, the "reveal" was distinctly underwhelming. Arguments against a straw man are only persuasive if no-one notices that your foil is a straw man.
Yeah, I always understood deliberate practice as reflecting on what you're doing and focusing on parts which are wrong, i.e. figuring out where you might be missing the "tacit knowledge".
Ty Tashiro writes about how "awkward" people need things to be spelled out and made explicit. With that, we can function as well as anyone. Without that we suffer when we get advice like "be yourself", "read Dale Carnegie", etc.
Ty was lucky to have parents that understood that he needed this and made sure he got it. I didn't. Often it would take me a while to get what came naturally to the other kids and adults never seemed to recognize that, which made me feel really lonely.
While I agree that tacit knowledge exists and is widespread, I think the author misuses the word "tacit" to mean unspeakable rather than unspoken.
Clearly unspeakable knowledge is a subset of unspoken knowledge, but there is definitely unspoken knowledge that is speakable if we take the effort to make it explicit.
"things awkward people need spelled out" is a great example of tacit knowledge that is often speakable (but we don't bother because the tacit assumption is that the tacit knowledge is shared by all.)
By conflating the unspoken with the unspeakable, the author fails to realize that a great deal of our tacit knowledge can be made explicit or at least partially explicit (and the value that can be captured by finding new ways to explicate the formerly unspeakable.)
I understand this to be "difficult to express in a useful manner", and many soft skills fall into this category. Plenty of people who are good at a skill put together advice that is actually worthless. Many of them don't even know why they are good at it.
No, I don't think it's a misuse of the word "tacit". It's a spectrum which has a boundary line on one side only, but is open-ended on the other side. The boundary between unspoken and codified knowledge is clear-cut. The unspoken and the unspeakable are just one spectrum, and the author's model does not have a boundary between them, nor does he seem interested in finding such a boundary.
In other words, unspoken knowledge can be learned from two sides, not one. It's understandable that the author focuses on their side, but I don't think he does so with the intention to discourage "the effort to make [the knowledge] explicit".
The author doesn't acknowledge any sort of spectrum and makes a general statement that trying to make tacit knowledge explicit is so hard as to be pointless to try.
> My take on this is that it is so difficult that we shouldn’t even bother; assuming that you are reading this because you want to get good in your career, you should give up on turning tacit knowledge into explicit knowledge and just go after tacit knowledge itself.
Maybe it's just my reading, but what follows this excerpt puts it in context. I.e. we shouldn't bother with converting the entirety of unspoken knowledge in some field:
> Why do we know this?
> [What many researchers found out] was that it is extremely difficult to encode all the possible branches and gotchas and nuances from a human expert into an expert system.
But the author doesn't say "don't only try to turn implicit knowledge into explicit knowledge"; he says "you shouldn't even bother".
To me that sounds like a direct analog of "nobody will ever understand everything about the source code without reading it, so we shouldn't bother with documentation".
Obviously we can't / shouldn't document everything, but creating documentation as you learn an undocumented legacy code base can be of great value (as can asking other developers to explicate their implicit knowledge).
Over and over the author ignores the unexplicated-but-explicable and assumes that implicit knowledge is inexplicable.
"Be yourself!" cried the attractive, socially adept and successful person to the pungent, unshaved, overweight college-dropout loser in a ratty Futurama t-shirt who lives in a friends' basement and struggles to pay the exceedingly lenient rent.
"On second thought," the savvy adept said, "Strive to become the person you wish you were and hope that people will like who that is."
> Ty Tashiro writes about how "awkward" people need things to be spelled out and made explicit. With that, we can function as well as anyone. Without that we suffer when we get advice like "be yourself", "read Dale Carnegie", etc.
As an awkward person myself this certainly rings a bell. But for me it's more about getting permission. Growing up in an abusive household, everything that I did was like walking in an open minefield. You're running and doing things intuitively and everything seems normal until you're hit with an explosion of verbal and sometimes physical abuse.
I can relate to this. The "permission" aspect also cuts both ways sometimes, because I learned to be cautious and then _that_ became the thing I was doing wrong, according to others, who berated me for not being assertive enough.
What's worse is that when I left my home and had finally the permission to go outside, at ~14 years old, I was basically unable to make any decisions or take any initiative. I literally had to relearn how to behave in society and at work, one hurtful mistake at a time.
I don't think the guy is any sort of "certified expert" but it rings true to me. And occasionally I've tried to give people related advice but found I couldn't. For example: how do you know when you're close enough with someone that you can tease them in a way that they'll take well vs badly? I'm pretty good at knowing it, but I'll be damned if I can explain it.
tl;dr of the article is that when people are learning, explicit rules and guidelines are helpful. Once you get better, they are unnecessary and even harmful. People in the latter stage tend to give crappy advice like "just be yourself" because it works for them now, they don't really know exactly what they're doing (i.e., the knowledge is implicit), and they've forgotten what it's like to be totally incompetent.
Anyway, perhaps the point is that while it's difficult to externalize how to "not be awkward" (and indeed it would likely be different for every social group) you can probably use rules to get along well enough, for long enough, to figure it out the hard way.
Since tacit knowledge can be gained experientially, watching videos on social skills where the actions are demonstrated can build your social skills just like being instructed on how to ride a bike can make it easier for you to learn how to ride a bike.
> Without that we suffer when we get advice like "be yourself", "read Dale Carnegie", etc.
I've grown to appreciate such useless advice. It is useless in answering the question you wanted help with but it is a strong signal that you can prune the person who gave such advice out of your social graph without any negative repercussions.
Couldn't you argue the person you would be pruning is a person that is most likely to appreciate you just as you are?
It seems reasonable to argue that if they are indeed in your social graph and they give you that advice, perhaps they like you for who you are and thus the advice at least from their perspective is good.
Making it an absolute was overstating my case. In the good-faith scenario, the other person does appreciate you as you are, so it's not worth burning that relationship. Probably wise to make a mental note not to ask them for advice, though, assuming your goal was more than being accepted by the person you just asked.
At one point when my son was young and not doing well in school we had him tested. We were told he was a kinesthetic learner. At about that time he was getting into skateboarding. When he grew up he became a brilliant skateboarder who is well known for it in our hometown. He put as much thought into learning a new trick as I put into solving a programming problem. He still doesn't like school.
“Everyone is a genius. But if you judge a fish by its ability to climb a tree, it will live its whole life believing that it is stupid.”
No substantive evidence exists suggesting Einstein made this statement...QI [quote investigator] has identified an influential essay called “An Educational Allegory” that was written under the pen name “Aesop, Jr.” and published in the “Journal of Education” in 1898. The author was later identified as Amos E. Dolbear of Tufts, a prominent physicist and inventor. The essay emphasized the absurdity of using a single inflexible standard for assessing the achievement of each individual student.
Programmers have another common example of how a belief in transmissionism often doesn't work out as well as the author hopes: the monad tutorial.
Here are other examples I can think of:
- Learning programming in the first place. Many people struggle.
- Teaching rhythm to an older adult with no musical experience.
- Ear training. No explanation will substitute for practice.
This isn't to say that explanations never work and you shouldn't try, but rather their hit rate may be lower than you think, that coming up with the right exercises might work better, and that beta testing your work is important.
I’ll take this as a shameless opportunity to plug my startup: Pathbird (pathbird.com). Happy to chat more about this if anyone is curious: travis@pathbird.com.
I developed it with a professor at UMich to teach a data science course for non-CS grad students. He had a deep belief in tinkering, exploration, and one-step-at-a-time learning. And I’ve seen it work pretty well. Pathbird itself is a platform for instructors to build these kinds of computational, “guided” lessons that emphasize the experience and process of learning.
Far too many intro CS/programming lessons read like glossaries. Obviously the syntax of if statements and for loops is important, but starting there on day one encourages students to miss the forest for the trees.
How do you actually teach both rhythm and ear training? I'm struggling with both right now. I've paid a teacher, and it's helping, but now I'm curious how we actually teach these things.
These are both things where practice can help. For rhythm, as skybrian noted, try counting and clapping along. Then try writing down the rhythm of a melody or some other part of a piece of music. Have your teacher or someone with better rhythm evaluate your work. Keep doing that with more and more challenging pieces over time. BTW, just as an aside, most pop music vocal melodies are fairly syncopated and probably not the best choice for starting. Classical music _of the classical period_ is probably a lot easier (but things get more complex as you move towards the romantic and modern periods).
For ear training, you need to practice interval recognition, chord recognition, transcribing melodies, transcribing chord progressions, etc. And _also_ you should be doing a lot of singing, especially sight singing. Again, you'll need someone more expert than you to tell you if you're getting it right in some cases. But if you're transcribing a piece that has sheet music you can get instant feedback. And the same goes for sight singing. Sing it, then play it on the piano and see if it sounds right. Better yet, _record_ yourself singing it and play along on the piano. If all the notes are the same, you win.
The other thing you should be doing is actually playing music! Doing lots of exercises is great, but the whole point of this is to make you a better musician, not just someone who's really good at notating rhythms of the music they hear.
(Source: I have a master's degree in music composition and I taught ear training and theory during grad school. Plus I also struggled quite a bit with ear training (though not rhythm) myself.)
For rhythm, counting and clapping seems to help, avoiding the complexity of playing the right notes. Playing to a metronome slowly is good practice but my student dislikes it. I suppose rhythm games might help?
For ear training, I’ve read that singing helps, since it associates vocals with pitches. I’ve tried ear training programs and even wrote one, and it’s the sort of thing where you get better by practicing a little a day. Doing a lot of practice at once is just frustrating. I was doing better for a while, but stopped practicing and wouldn’t say my ear is all that good.
Recently I’ve been transcribing music, which probably helps some. It’s sort of like figuring out a crossword puzzle where it takes a few guesses to get the notes right.
A long article that is basically saying you can't gain deep expertise by merely reading an instruction manual and executing the instructions over and over. Expertise comes from a lot of experience with various situations, which develops a sense of what's right and what's wrong. That is not surprising to me at all.
I think you acquire much tacit knowledge by doing explicit practice. You practice and get better even though you don't really know why. The knowledge you gain then is not explicit, it is tacit.
Of course if you have a great mentor, so much the better. But this reminds me of the medieval guild system: you had to be an apprentice, an intern of sorts, for many years. The thing is, highly skilled people could make teaching others their main goal, but economically it makes more sense for them to use that skill themselves.
Therefore we can speculate that a lot of tacit knowledge remains tacit because those who possess it don't want to share it without good compensation, like having an apprentice who works for them for many years.
A YouTube channel could be made out of this idea: skill extraction. For every field...
Akin to a podcast/interview, where the whole purpose is for the interviewee to share their tacit knowledge.
Experts come on and are asked questions about specific situations. The questions are framed in such a way as to tease out the tacit knowledge (using the critical decision method).
... The more I think about it, the more I want it to exist. Does anyone know of something similar? Or is anyone interested in creating it, or just in having it exist?
There's a lot of material like what you ask for audio engineering. In my experience, it is effective, but you have to have the deliberate practice hours in order to absorb the tacit knowledge. And you have to put in a LOT of hours seeing how someone works in order to catch the nuances that can't be verbalized.
^ I really wish this existed too. (Something akin to it does exist at least for certain niches, eg detailed web performance audits where the perf expert has an explicit emphasis on teaching the audience...)
I don't think the inability to teach someone how to ride a bicycle comes from being limited by words and language; it's more a limit of muscle memory. Riding a bicycle involves controlling multiple muscles at the same time, and doing it without the subconscious, automatic part of the mind is impossible.
It's the same reason you can't teach someone to play guitar with words alone, without actually making them play and see their mistakes, even though you are perfectly capable of teaching which area to press and how to strum, and even of making the person memorize the sequence.
I've thought about something similar before in the context of explainable machine learning [0]. The author alludes to it: explicit knowledge is like an expert system. And I would argue that tacit knowledge is closer to a modern neural network that has learned from lots of examples. Asking such a system for an "explanation", just like trying to encode knowledge in an expert system, inevitably doesn't give a satisfying result because of all the judgment and caveats involved in real life. A better approach is to judge competence based on past experience.
As an additional corollary, the tacit knowledge concept is a good argument against decision frameworks generally, which destroy information by trying to capture experience in a rubric.
[0] Getting more out of experts by focusing on results and not process: observations of people and neural networks http://marble.onl/managing_ml.html
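A toy contrast along those lines, entirely my own sketch (the spam-flavoured rules, the single feature, and the examples are invented for illustration): the first function is the hand-written "expert system", while the second just fits a threshold to labelled examples, and asking it "why" only ever gets you back the threshold.

```python
# Hand-coded "expert system": rules someone tried to make explicit.
def looks_spammy_rules(subject: str) -> bool:
    return "free" in subject.lower() or subject.isupper() or "!!!" in subject

# "Learned" alternative: instead of writing rules down, fit a trivial model to
# labelled examples. Here, a single threshold on one crude feature.
def uppercase_fraction(subject: str) -> float:
    letters = [c for c in subject if c.isalpha()]
    return sum(c.isupper() for c in letters) / len(letters) if letters else 0.0

examples = [("FREE MONEY NOW", True), ("Meeting notes", False),
            ("WIN A PRIZE!!!", True), ("Lunch on Friday?", False)]

# Pick the threshold that makes the fewest mistakes on the examples.
best_threshold = min(
    (t / 20 for t in range(21)),
    key=lambda t: sum((uppercase_fraction(s) > t) != label for s, label in examples),
)

def looks_spammy_learned(subject: str) -> bool:
    # The only "explanation" available is: the uppercase fraction exceeded a number.
    return uppercase_fraction(subject) > best_threshold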
That said, I believe it is possible (and useful) to slowly extract parts of tacit knowledge into decision frameworks, a kind of "best practices" list for each field. This helps newcomers avoid past mistakes and frees up that time/cognitive capacity to push the boundary of knowledge/skill into uncharted territory. Otherwise we get stuck in the status quo.
Take chess. There is advice like "don't play such-and-such a move in this position". If you follow the rabbit hole, you'll see that hundreds of chess masters played those lines and weakened their positions, and that scores of players later analysed those games to codify it into "don't play foo in the baaz middle game". Of course GMs sometimes go against that advice, because they are GMs.
It's almost impossible to advance a field if we all relied on tacit knowledge alone, because it doesn't scale and we'd spend all our cognitive capacity repeatedly committing the same mistakes.
As an expert in my field, I repeatedly force myself to question the source of my judgement when someone asks me "why" after I say "this API doesn't look right". When I do so, I indeed realise that it violates some foundational principle, and I explain it. An expert who can't describe the source of their wisdom, at least once in a while, ends up getting stuck IMO. Because they are not exercising the deductive part of their brain, they come to rely more and more on their "wisdom". Over time they fail to adapt to changing conditions and become dogmatic and less relevant.
I think you can take it up a level from the individual to culture to see it even more clearly. I think one of the most glaring examples of tacit knowledge is simply physical agglomeration. If tacit knowledge wasn't that important, why is it that people pay thousands of dollars of rent just to be in the right spot while 'everything is available' on the net? Why does half of all VC money go into a bunch of zipcodes? Why do so few key professionals network like we're in 15th century Venice?
Physicality and proximity are incredibly important, ironically maybe more so in the world of 'knowledge work' than when it comes to physical activity. An incredible amount of knowledge work happens implicitly when people organize spontaneously without them even being aware. Alex Pentland wrote an interesting book on it called Social Physics where he tried to empirically measure how much more effective in-person exchange of information is.
First off, I phrased it somewhat weirdly. I meant to say there is a very small number of high-profile professionals who are close-knit and influential, not that there are few overall. But the point is that a lot of deals still go through very informal, tight-knit, often secretive hands, in the tech sector for example.
The 'PayPal Mafia' is, I think, a good example of this. It's hard to imagine you'd get the same number of businesses out of dispersed, random groups on the internet. Business knowledge and success, even in 'meritocratic' sectors, still depend on a lot of tacit and informal relationships.
Somebody serious about deliberate practice will spend a lot of time seeking "tacit knowledge." The author is possibly confused and thinks the term refers to repetition or reading or something. The original paper studies pianists and violinists (possibly chess players too?), people who aren't spending their exhausting practice in a library or chatting on Discord, except when that's the most important thing to do.
The strategic choice of how and what to study is also part of a good deliberate practice regime (interrupted from time to time in programming by needing to cram leetcode for no good reason, because why would a company want to hire somebody who was an expert at making modular easy-to-change systems when there are these heaps everywhere that need to be written from scratch in 15 minutes?)
I see your point, and I'm not equipped to articulate the distinction. It is fuzzy.
> Somebody serious about deliberate practice will spend a lot of time seeking “tacit knowledge.”
Putting words in the author's mouth, he's saying that tacit knowledge matters more in areas where (i) execution isn't clearly prescribed and (ii) you need to perform at a high level.
The author gets into how certain fields have a "missing manual" and require figuring things out yourself or undergoing an apprenticeship, prioritizing the attainment of tacit knowledge.
Leetcoding is a "missing manual" expertise path and doesn't converge, IMHO, the way chess, violin, athletics, etc. do.
I clicked this link because I’m currently building an MVP for a software product and getting a hang of the essentials has been going relatively well thanks to my tacit knowledge around computers/software.
However, what I did not expect was for the article to go into that Atul Gawande quote on the laparoscopic appendectomy. It just so happens that the MVP I’m building is part of a training program for laparoscopic skill and the other half of my job is making the actual educational content, including the appendectomy (I’m a jack-of-all-trades - we’re a tiny company). How strangely surprising!
I spend a lot of time testing our product with users, who all happen to be in surgical training. The professor of surgery with whom we collaborate regularly brings up the kid/bicycle/teaching experience. Laparoscopic suturing is currently seen as an art form, and it's such a great example, so he usually just tells people to practice. In surgery more than in any other field, everything is relative, mushy and tricky. You're juggling manual instrument skill with medical knowledge, along with people skills (in the OR). Our approach has been to focus: with our products you only learn the skill, none of the other stuff.
I'm definitely saving this one, and going to be exploring all the linked sources and materials. Thanks!
It was interesting to read the judo analysis and then the surgical example, and then to generalize by taking complex experiences (for lack of a better word) and deriving patterns out of them to understand what the important difference is. It seems like, in some sense, tacit learning might be viewed as second order: programming at the type level, while practice is programming at the term level. This also seems to model the tennis analysis: experts recognizing the posture of the opponent immediately prior to hitting the ball in order to infer the type of serve.
I've had a number of situations where I've run into trouble on projects because I have tacit knowledge about the domain - but I seem to struggle to make my tacit knowledge consumable by others, which leads to friction.
For example, to launch a new product we need the obvious end-user-visible features - but also sufficient back-office features to be able to quickly fix the inevitable problems that crop up. Things like adjusting a customer's account balance, say to give them a quick credit when something else goes wrong. Basic stuff right?
But management - this has happened to me - will often say something like "why are there going to be problems? Don't you know what you're doing?". And of course I do know what I'm doing, that's why I'm assuming something nobody has thought of is going to go wrong, and I'm trying to create a way to help us mitigate it without taking the lustre off the launch.
The problem as I see it is that management is often seen as a generic function of business with a translatable skill set, but in most cases they don't have the tacit knowledge they need to make good decisions in the domain they're supposedly managing.
Being able to turn this tacit knowledge into something consumable by others would be a huge boon to project management IMO.
Michael Polanyi[1] introduced the topic of tacit knowledge in his short book (100 pages), "The Tacit Dimension". The blog series only mentions Polanyi in passing in its third article. And only indirectly references his book in a dismissive tone, because "this blog is a practitioner's blog: it is interested in what is useful, not what is `true`". Well, regardless of what the author's blog is about, it is still important to grapple with the original idea.
In his book, Polanyi also uses the idea of tacit knowledge to tackle a question from Plato's famous Socratic dialogue, 'Meno': "can virtue be taught?"
You can find an excellent English translation of 'Meno' in Hackett's "Five Dialogues"[2] publication. A superb short book with a great selection of five short Platonic dialogues; reading it gives a richer picture of Polanyi's ideas. It[2] also serves as a great intro to Plato.
I have no idea what he's trying to compare: deliberate practice (a method of learning) is one important way to learn tacit knowledge (a kind of knowledge).
How can you compare a method of learning to a kind of knowledge? The comparison does not make sense at all.
Exactly. The post doesn't make sense at all. Learning is a cycle between "tacit knowledge" and "deliberate practice".
There's also this chicken-and-egg problem:
(1) you can't absorb the knowledge well because you haven't practiced, and
(2) you can't practice well because you don't know shit.
The article leads with a bad example. If I heard from my technical lead a phrase like "feels right" or "feels wrong", I would go: huh, let me do more research on my own and then we'll try again.
Seniority differs from juniority in the ability to make things as explicit as possible, from the requirements to the implementation specification.
I think in software engineering, the more explicit, the better.
Nobody ever says that in professional environments. They’ll retroactively develop a plausible-sounding justification for their gut instinct. Sometimes, if they’re not particularly self-observant they’ll even believe the explanation they created is the reason they believe what they believe.
> Sometimes, if they’re not particularly self-observant they’ll even believe the explanation they created is the reason they believe what they believe.
Most people seem to believe their post hoc explanations. Or at least most people won't admit that they are post hoc explanations; it is hard to know the difference, since they come up with those post hoc explanations with the intention to manipulate you. That is also the reason it is so hard to change people's minds in an argument: they won't bring up the real reasons they believe stuff, they just bring up the post hoc things. People seem to think that exposing your real reasoning makes you vulnerable. Like, yeah, you get vulnerable to learning new stuff and changing your opinion; that isn't so bad.
The title reads like a category error. Yes, expertise consists of tacit knowledge, but that doesn't tell you anything about whether deliberate practice is the way to acquire it or not. It only tells you that memorizing text won't get you there. To the extent that it's not a category error, it's a tautology: arriving at the office is more important than driving to the office, making a milkshake is more important than running the blender, and acquiring expertise is more important than any particular technique you might use to acquire expertise.
At the bottom Chin says what he thinks is the way to acquire tacit knowledge: apprenticeship. He should have called the article "Apprenticeship is more important than deliberate practice," which hinges on precisely what you mean by "apprenticeship" and "deliberate practice," either of which can be defined in ways to make the statement true or false.
To me pedagogy is mostly being a companion to the newcomer and coupling with their feelings while reducing the search space. Avoid high risk, avoid long-term blockers. Provide a fun spot to swim in and gradually increase its width.
Code review. The feedback can range from the bare minimum of acceptance all the way to tricks, tips, wisdom, philosophy and so on. Normally you only get the former, but you can get more if you ask for it.
Ehh. 99% of code review is useless nitpicking about whitespace and variable names. Frankly, I have learned far more about software engineering reading hacker news than I have through reviews.
WRT teaching people how to ride vehicles: I have taught a few people to ride a motorcycle and have myself been through a few courses.
What I found works best is a roughly 50-50 combination of verbal explanation of the important know-how (the bike moves where you are looking, don't stare at the speedometer/front fork, look through the turn) and preparing the student to learn the tacit component on their own (counter-steering is spooky action at a distance; just make it feel right and pay attention to how the bike reacts to the movements of your body).
This would not sit well in my field - I'm a classical pianist - because I'm getting hung up on the role that knowledge ('tacit' or otherwise) plays in what I do, vs. the reality of a very motor-focused activity. Yes, after years of what I'd call deliberate practice, certain heuristics emerge - stylistic conventions, performance practice, what fingerings work in different situations. All fine - and those elements of tacit knowledge are definitely enabling. But in rehearsal and in performance, there's very little cognition going on. It's very much a motor activity. Instead, I rather think that tacit knowledge and deliberate practice interact in mutually-enabling ways. Deliberate practice allows you to build mental models (~tacit knowledge) that serve to enable higher quality practice.
Perhaps I'm missing something here. But the hypothesis that tacit knowledge is more important than deliberate practice - without reference to the field we're talking about - is a bold claim. I know cognitively that the staccato of Mozart is qualitatively different than that of, say, Bartok. But that doesn't do anything until I develop the motor skill to make it so.
In the sport of Ultimate Frisbee, tacit knowledge is key. I'd never heard of the term before this article, but it makes a lot of sense, because Ultimate is a relatively new sport without a developed coaching system, and many players come to it as adults.
In college, the team tried to teach us explicitly and it made absolutely no sense. By playing for years, I've developed the simple instruction of "Stand where you do not want or expect the disc to go, and at the right time, run to where you do want the disc to go."
The part that takes hundreds of hours is understanding "where you want the disc to go", and that can't be explained clearly because it changes every five seconds. After 10 years, the answer is usually obvious to me, but a newcomer has absolutely no idea why anything is happening or why they are not getting thrown to.
I speedcube, and deliberate practice is extremely important. I also code, and deliberate practice doesn't do much in that area.
I need to have a feel for the code, understand what chunks of code do without having to read every line. That's tacit knowledge.
Maybe the difference could be explained with racecar driving. Driving the track correctly, hitting apexes, downshifting at the right moment; that's deliberate practice.
Knowing your car, understanding what a particular vibration means, feeling when you're approaching your traction limit, knowing that your engine is weak from 4-5k RPM, that's tacit knowledge.
Deliberate practice can foster tacit knowledge, however. Going through the motions opens your mind up to understanding what's going on behind the relatively simple task you're performing.
You can do deliberate practice for technical creativity; I do it all the time. You basically take any problem, like "how to build a server", and instead of looking up guides on how to do it you just start breaking it down. A server needs to connect to other computers, so you go and figure out how to make such connections, then how to make multiple connections in parallel, then how to listen for connections, etc. And soon you will have a working server. That was how I learned programming. It is slower and more tedious than just looking it up, but it is good deliberate practice for becoming more technically creative. Looking things up doesn't teach you much; figuring things out yourself is how you practice in this domain. Deliberately not looking up solutions to known problems is like a superpower. You don't want to spoil yourself there: you can only figure things out before you have seen how it is done; after that you just recall the solution.
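As a rough sketch of where that break-it-down exercise tends to land (my own minimal Python version, not the commenter's code), here is a server that listens, accepts connections, and handles several of them in parallel:

```python
import socket
import threading

def handle(conn: socket.socket, addr) -> None:
    # Echo whatever the client sends until it disconnects.
    with conn:
        while data := conn.recv(1024):
            conn.sendall(data)

def serve(host: str = "127.0.0.1", port: int = 8080) -> None:
    # Step 1: make a listening socket; step 2: accept connections;
    # step 3: hand each connection to a thread so they run in parallel.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, addr = srv.accept()
            threading.Thread(target=handle, args=(conn, addr), daemon=True).start()

if __name__ == "__main__":
    serve()
```

Each of those three steps is exactly the kind of sub-problem you can figure out from first principles before ever reading how other people structure their servers.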
I only skimmed a few of the articles but as far as I can tell it boils down to:
1) He says that the ideas of "deliberate practice" is wrong
2) Describes "deliberate practice" as repeatedly following instructions... which as far as I know, isn't what "deliberate practice" is.
3) Then he says what you really need to do is practice smartly (a lot) to gain "tacit knowledge"
... but isn't that pretty much what "deliberate practice" prescribes? Practice a lot at the edge of competence (or just over it). It just doesn't explicitly say "to gain tacit knowledge".
I thought that when kids learn how to ride a bike they can (1) use those "small" wheels on both sides that allow them not to fall and learn how to balance.. or (2) their parent puts a metal bar behind the seat and uses it to help keep the kid in balance: when the kid falls too much in some direction the parent uses the bar to push the bike back. After some time the kid learns how to balance on their own.
I suspect it very much is a practical skill. I'd be shocked if genetics are the sole determinant in your wit - surely other aspects such as linguistic ability, social skills, knowledge of pop culture, confidence, etc. all contribute to wit. And those can all be trained.
It would take time, of course, and a lot of practice rather than just having it explained to you. But I don't see why it wouldn't be trainable.
I thought about learning to play the violin when reading this article. A teacher might explain everything to you, but it's impossible to understand, as it's something so unnatural and driven by feel and experience. In fact you start out playing with not-great technique for a few years until you develop the ability to start working on the finer details, and so on.
Both sides of the aisle in this debate appear to hold a common assumption that our current methods of sharing knowledge (i.e., written and spoken words) are the best methods we will ever have at our disposal.
However, should a day come when a new method circumvents this very crude way of knowledge transmission, this debate will be significantly changed or even rendered moot.
Interesting to me, as I feel good infosec architects and engineers embody this. I've been struggling with how to educate large swaths of devs and ops people, but the amount of knowledge is hard to distill; much of it definitely seems tacit, as described.
Yes, tacit knowledge exists. Riding bicycles is a great example. Technical design is a terrible one. Design docs are so important partly because they force you to put some thought into making your implicit assumptions explicit, in a way where you and other folks can question and learn from them.
Yeah, I can't tell if being the kind of person to whom this feels like bullshit is a character flaw, but it bears no resemblance to the way I learned programming or the way anyone I work with programs. Everything I know is an explicit best practice I learned somewhere and wrote down, and everything people do by instinct is stuff like creating global variables everywhere. To be fair I'm only five years in professionally.
The rules of HN do a good job of keeping topics on topic and informative, but sometimes sterilise it of genius like this. Oh well, some things are worth taking downvotes for.