The Dark Side of Expertise (lwn.net)
203 points by signa11 on Jan 24, 2020 | 47 comments



> They then told the students to go up the hallway to a different room where there would be another test. What the students didn't know was that the test was actually in the hallway; the time it took each participant to walk down the hall was measured. It turned out that the students who had been exposed to the "elderly words" walked more slowly down the hall. Attendees might be inclined to call that "absolute crap", Brady suggested, but it is not, it is repeatable and even has a name, "the Florida effect", because Florida was used as one of the words associated with the elderly.

I haven't read widely on this, but from what little I have read, most attempts at reproducing this and other priming experiments have failed.

(see, e.g., https://replicationindex.com/2017/02/02/reconstruction-of-a-..., https://replicationindex.com/category/priming/ and https://www.discovermagazine.com/mind/reproducibility-crisis...)


I postulate that if this is a true cause-and-effect relationship, it depends on the words having a strong association with the intended concept, and on that concept being significant to the individuals.

Further, I would postulate that the selection of study participants at that time, and the construction of the test, were very well matched. Today there is probably greater cultural diversity and commingling, and there may also be less unconscious association of those words with slowness.


This was pretty rambly and pulled in some fairly unrelated examples.

* We never get a clear answer for why the Civic Center Engineers got the inputs wrong, so we can't conclude that it was because of their expertise.

* With the firefighters, there's a pretty easy proximal explanation, which again is not their expertise misleading them: panic. If anything, the solution shows that expertise was the answer to the problem, not the cause of it: by adding running without packs to their expertise in a concrete way, they were able to override panic.

* With the priming examples, I can see how this might be related, but the only example that actually involves expertise leading someone astray is the baseball example. And this priming research, contrary to what's claimed in the article, isn't reproducible. Even if it were, these sorts of lateral-thinking exercises have pretty limited applicability: detectives or theoretical physicists use a high degree of lateral thinking, but many careers don't. And I didn't get those examples from the article; I had to come up with them myself, which hardly speaks to this being an insightful article. Ultimately, it's unclear whether expertise even leads detectives or theoretical physicists astray: surely part of gaining expertise in a field that requires a lot of lateral thinking involves building the skill of putting aside preconceptions. Again, I don't think we can conclude that expertise is the problem: it may be the solution.

Overall, this topic could be interesting, but this isn't an insightful article on it.


You are entirely missing the point in your first two points:

* The issue here isn't that the inputs were wrong, it's that the experts could not (or would not, if you don't want to be generous) accept that the structure was under-strength in spite of multiple reports; they believed their calculations rather than their lying eyes.

* There is no particular evidence that the firefighters panicked. Whether or not they did, though, they did what their training led them to believe was their only choice: keep their equipment and try to reach the top of the ridge. The foreman was the only one to realize either that the training wasn't going to work or that they had another option.

Do you have a great deal of expertise in this area?


> The issue here isn't that the inputs were wrong, it's that the experts could not (or would not, if you don't want to be generous) accept that the structure was under-strength in spite of multiple reports; they believed their calculations rather than their lying eyes.

How does this support the conclusion that they were misled by their expertise? There are many possible explanations here.

> There is no particular evidence that the firefighters panicked. Whether or not they did, though, they did what their training led them to believe was their only choice: keep their equipment and try to reach the top of the ridge. The foreman was the only one to realize either that the training wasn't going to work or that they had another option.

There is no particular evidence that the firefighters were misled by their expertise, either.

If anything, wasn't the foreman's realization arguably due to his superior expertise?

> Do you have a great deal of expertise in this area?

In the area of basic logic, such as figuring out whether evidence supports conclusions? I suppose you could say I do, although I wouldn't claim anything special.


> We never get a clear answer for why the Civic Center Engineers got the inputs wrong, so we can't conclude that it was because of their expertise.

I think that's because you're focusing on a component, and not the whole. The construction project had a problem, and part of that is because they trusted the expertise of the designers. Various people in charge of aspects of the project should have pushed back harder, noting that the expertise they were relying on was obviously false, since it didn't match reality. The problem is not the computers but the designers, and the expertise in question is that of "trust your designers". The story and its moral play out the same whether it's a computer, some bad math, or even deliberate sabotage.

Funnily enough, it might be possible to attribute your focus on the engineers to priming itself. Even if you aren't an engineer or in a related discipline, this forum is very heavily frequented by such people and often focuses on such issues. As such, focusing so much on the engineers themselves, and what they did or didn't do, to the point of missing the whole can be expected. Well, it can be expected if you accept that priming plays a role, at least to some degree.


Re point 1:

> This was the early 1970s, he said, why were these engineers so confident in their calculations? As guessed by many in the audience, the reason for that was "computers". In fact, when they won the bid, they told the city of Hartford that they could save half a million dollars in construction costs "if you buy us this new, whiz-bang thing called a computer". It turned out that the computer worked fine, but it was given the wrong inputs. There was an emotional investment that the engineers had made in the new technology, so it was inconceivable to them that it could be giving them the wrong answers.


Yes, but we never find out why they got the inputs wrong.

Does this sound like expertise to you?


I think "wrong inputs" here is an odd way of saying that the model was bad (presumably to distinguish that problem from outright bugs in the design software or computer hardware failure).

https://eng-resources.uncc.edu/failurecasestudies/building-f...

says

« The roof design was extremely susceptible to buckling, which was a mode of failure not considered in that particular computer analysis and, therefore, left undiscovered. »
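
To make "a mode of failure not considered" concrete, here's a toy sketch in Python (all numbers are hypothetical; nothing here is from the actual roof) of how a member can pass a linear stress check while flunking an Euler buckling check that the analysis never ran:

    import math

    # Assumed properties of a slender steel compression member (hypothetical)
    E = 200e9              # Young's modulus, Pa
    A = 5e-3               # cross-sectional area, m^2
    I = 2e-5               # second moment of area, m^4
    L = 9.0                # effective length, m
    yield_stress = 250e6   # Pa
    axial_load = 9e5       # applied compression, N

    # Linear "stress only" check: looks fine
    stress = axial_load / A
    print(f"stress: {stress/1e6:.0f} MPa vs {yield_stress/1e6:.0f} MPa ->",
          "OK" if stress < yield_stress else "FAIL")   # 180 vs 250 -> OK

    # Euler buckling check: the mode the analysis skipped
    P_cr = math.pi**2 * E * I / L**2
    print(f"buckling: {axial_load/1e3:.0f} kN vs critical {P_cr/1e3:.0f} kN ->",
          "OK" if axial_load < P_cr else "FAIL")       # 900 vs ~487 -> FAIL

Same member, same load; only the second check exposes the failure mode.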


Okay but again: was this failure because their expertise misled them?


Not as far as I can see, no.

That article contains the following claim: « Computers, however, are only as good as their programmer and tend to offer engineers a false sense of security. »

which matches the "dark side of expertise" talk's bit about « There was an emotional investment that the engineers had made in the new technology, so it was inconceivable to them that it could be giving them the wrong answers. »

That seems to me to be a different issue from being misled by one's own expertise, and in any case neither source bothers to give any evidence that it's true (that is, that the computer's involvement was the cause of the unreasonable trust in the model's results).


No. In fact, if they were misled by computers, and they were not software engineers (which I had assumed from their being called "Design Engineers" in the anecdote), it follows that they were misled not by their own expertise but by the expertise they assumed of this unknown, mysterious, powerful new thing that a lot of money had been poured into.


They ignored the actual fact that it was sagging more than predicted and insisted the calculations were right. That alone might be OK, but someone allowed the project to proceed without explaining the contradiction. Real observations were dismissed in favor of belief in the expertise.


Hm. You're saying that inexpert decision-makers were misled by experts?

That is true, but I'd argue that the article is making a different claim: the article is claiming that experts are misled by their own expertise.


It seems like one difficulty is knowing which expertise is important. Presumably the contractors thought they had expertise, but were lacking. The firefighters presumably thought they had expertise, but were lacking. What expertise do I think I have, but am actually lacking?


Yeah, that's difficult. The only answer I've found is experience. Gaining experience for yourself is painful because it's slow, and part of the experience is suffering the consequences of your mistakes: in fields like firefighting (or rock climbing, which I love) your mistakes can literally kill you. So hopefully you learn from other people's experience and mistakes as much as possible.


The experts were led to disregard reports of sagging by their trust in the computer model.


>They ignored the actual fact that it was sagging more than predicted and insisted the calculations were right.

Seems like they lacked the expertise needed to properly evaluate their model.


Or experience can create a bias where you "know" you are right even when the evidence says otherwise.

Age usually humbles people: after experiencing this often enough, they tend to check things... if they mature properly.


> With the firefighters, there's a pretty easy proximal explanation, which again is not their expertise misleading them: panic.

Exactly. Plus, while his idea was fascinating, it's understandable that the others didn't follow suit; he essentially invented it on the spot:

> Similar types of escape fires had been used by the plains Indians to escape the fast-moving, brief duration grass fires of the plains, and the method had been written about by James Fenimore Cooper (1827) in The Prairie, but in this case Foreman Dodge appears to have invented it on the spot, as the only means available to him to save his crew.

https://en.wikipedia.org/wiki/Mann_Gulch_fire


> "the Florida effect"

This 1996 study by John Bargh, using only 30 psychology undergraduates as test subjects, is now better known as a prime example of bad psychology, done with too small a sample size and dubious statistical methods. This study is at the very center of the replication crisis in psychology. That the conference presenter has not followed what has happened in psychology over the past eight years erodes the scientific credibility of the presentation.

https://replicationindex.com/2017/02/02/reconstruction-of-a-...

https://www.nationalgeographic.com/science/phenomena/2012/03...

https://replicationindex.com/2019/03/17/raudit-bargh/

https://www.nature.com/news/disputed-results-a-fresh-blow-fo...

https://www.nature.com/news/nobel-laureate-challenges-psycho...

https://www.nature.com/articles/d41586-019-03755-2
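
For a sense of just how underpowered n = 30 is, here's a rough power calculation (Python with statsmodels; the 15-per-group split and the "medium" effect size d = 0.5 are my assumptions for illustration, not figures from the study):

    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()

    # Power of a two-sided t test with 15 subjects per condition
    power = analysis.solve_power(effect_size=0.5, nobs1=15, alpha=0.05)
    print(f"power with n=15/group: {power:.2f}")   # roughly 0.26

    # Subjects per group needed for the conventional 80% power
    n = analysis.solve_power(effect_size=0.5, power=0.8, alpha=0.05)
    print(f"n/group for 80% power: {n:.0f}")       # roughly 64

In other words, even if the effect were real and medium-sized, a study that size would detect it only about a quarter of the time.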


Priming in the form of the Florida effect is a famously unreplicable result. It's not a real effect. I'm more sympathetic to the word-game example, yet it's not obvious how that translates to anything beyond it.

With respect to the Hartford Civic Center, as always, it's a little more complicated than that. Yes, the architects stood by their calculations, but this was more a process issue than hubris. Despite the new techniques used, the plans were subject to peer review. Additionally, the construction was not fully according to plan. Weaker struts were actually used than called for in the design, and that's on the construction manager rather than the architect.


> Weaker struts were actually used than called for in the design, and that's on the construction manager rather than the architect.

I was wondering if something like that was the case. It's common practice for structural engineers not to inspect the building, because doing so would make them liable if they failed to catch deviations between the construction and what they based their calculations on.


The civic center arena roof collapse is similar to the FIU pedestrian bridge collapse in that the engineers trusted their calculations over physical evidence of imminent failure:

> At 9 a.m. on March 15, a university employee heard a loud "whip cracking" sound while under the bridge span, waiting for a red traffic light. At the same time, the design-build team met for about two hours at the construction site to discuss the cracks discovered on March 13. Representatives from both FIU and FDOT were present. The FIGG lead engineer's conclusions were that the structural integrity of the bridge was not compromised and that there were no safety concerns raised by the presence of the crack.

https://en.wikipedia.org/wiki/Florida_International_Universi...

Fortunately for NYC, when William LeMessurier received a phone call about Citicorp Center from a student, he went back and re-ran some calculations and realized that quartering winds would put his building under more stress than he originally realized. But no harm, that would be handled by the extra safety factor built into his calculations. Unless it turned out other changes had been made during construction he wasn't originally aware of. Well, I won't give away the whole story:

http://people.duke.edu/~hpgavin/cee421/citicorp1.htm

http://www.engineersjournal.ie/2015/12/08/citicorp-centre-to...
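
The quartering-wind effect is simple enough to sketch with a toy model (deliberately crude; real load paths are far more complicated). A wind hitting the corner at 45 degrees loads two perpendicular faces at once, and in bracing arrangements where the forces from both directions land in the same diagonal members, they add:

    import math

    # Normalized brace force when the wind blows square onto one face
    face_force = 1.0

    for angle_deg in (0, 45):
        a = math.radians(angle_deg)
        # Load components on the two perpendicular faces
        fx, fy = face_force * math.cos(a), face_force * math.sin(a)
        combined = fx + fy  # shared diagonal members see the sum
        print(f"wind at {angle_deg:2d} deg -> relative brace force {combined:.2f}")

    # wind at  0 deg -> relative brace force 1.00
    # wind at 45 deg -> relative brace force 1.41

That factor of sqrt(2), about a 40% increase, comes purely from geometry, and a check that only considers winds perpendicular to the faces never exercises it.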


I was glad to see that the comments on the webpage provided a link (https://replicationindex.com/2017/02/02/reconstruction-of-a-...) to a summary of how priming research has been cast into doubt in recent years. Any talk, article, or premise that uses priming experiments to explain something is a red flag for me. I'm not saying none of it really works, but it is clear that something more complex is going on than we think, and many of the priming experiments, at a minimum, have been misinterpreted (and in some cases are just totally unreplicable).


I don't buy this headline at all; the author never establishes that there are actual experts behind this story...

The primary story used here is that a sagging arena roof was consistently deemed safe by "engineers" (no proof of expertise provided) based upon their calculations... and this is somehow a "dark side" of expertise? It reads much more like ineptitude; that'd be like a coder here constantly ignoring terrible performance and numerous bugs while claiming their architecture is just fine.


There's also an unexplained leap from "the calculations are correct" to "we should proceed as planned even though it's sagging when it shouldn't".

Even if the engineers were infinitely arrogant and assumed nothing could possibly be wrong with the design, you'd expect them to react by saying "you must be building it wrong or using substandard materials", or something.


This is the same annoying problem I found with Malcolm Gladwell's Blink. He draws the conclusion that we can trust our first impressions, but it's really unclear how he gets there: some of his evidence is inconclusive, and some of it directly contradicts the claim.


I'm starting to think someone was misled by their expertise into thinking this article was any good.


This is a write-up of pseudoscientific gobbledygook: just-so stories that don't do justice to the complexities of the catastrophes they draw trite morals from.

Here's a more thoughtful write-up of the Civic Center collapse, which attributes it largely to a large project with diffusion of responsibility, where no one was empowered to fix the problems they were held responsible for: https://eng-resources.uncc.edu/failurecasestudies/building-f...


Just another study about expertise and how it may not be beneficial:

https://www.bmj.com/content/357/bmj.j1797

From "Physician age and outcomes in elderly patients in hospital in the US: observational study":

> Physicians’ skills, however, can also become outdated as scientific knowledge, technology, and clinical guidelines change. Incorporating these changes into clinical practice is time consuming and can at times be overwhelming. Interest in how quality of care evolves over a physician’s career has revived in recent years, with debates over how best to structure programs for continuing medical education, including recent controversy in the US regarding maintenance of certification programs.

> Within the same hospital, patients treated by older physicians had higher mortality than patients cared for by younger physicians, except those physicians treating high volumes of patients.

Basically, expertise can change, and if you don't keep up, you may be following outdated advice (which might not be useful, or might even be harmful). I feel like the "dropping your equipment" example is part of this. You need to know the right tool for the right job, and sometimes doing the same thing for years might lead you down the wrong path.


>I feel like the "dropping your equipment" example is part of this. You need to know the right tool for the right job, and sometimes doing the same thing for years might lead you down the wrong path.

This happened at my last job. The company I worked for hadn't really changed their methods for about 7 years. Jobs went slowly, we worked ridiculous amounts of overtime, and mistakes were a regular part of the job. Over the years I worked there, I went through everything, replaced a few processes, and convinced the owner to get newer, more efficient tools. In the end, I cut the amount of overtime we worked down to almost nothing, reduced human operator time on our machines from about 2-3 hours per run to 20 minutes, and reduced mistakes and errors from once a week or so to once in a blue moon.

The thing is, I didn't do anything amazing or groundbreaking; they could have been operating that way all along. They'd just found something that worked and never bothered to try to improve their system. I was left alone for a few years and given free rein to do pretty much whatever I wanted. I got sick of being overworked, so I started figuring out and changing whatever I could to make things quicker, easier, and better, and in the end it really didn't take much to make a drastic improvement.


TBH, if you gave me the list "dark," "shot," and "sun", I'm not sure I'd ever think of "glasses" as a word that goes with all three, since I still don't really know what dark glasses are. I guess like dark sunglasses? Or maybe they mean "through a glass darkly"? Priming is a real effect, but that example of priming seems like it was poorly selected.
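
For what it's worth, the puzzle itself is mechanical enough to brute-force. Here's a toy solver sketch (the phrase list is a made-up stand-in for a real corpus; whether "dark glasses" belongs in it is exactly the dispute here):

    CUES = ("dark", "shot", "sun")
    KNOWN_PHRASES = {  # stand-in corpus of compounds and two-word phrases
        "dark glasses", "dark room", "shot glasses",
        "shot put", "sunglasses", "sun spot", "sun room",
    }
    CANDIDATES = ("glasses", "room", "spot", "put")

    def pairs(cue: str, word: str) -> bool:
        # Accept "cue word", "word cue", or fused forms like "sunglasses"
        forms = {f"{cue} {word}", f"{word} {cue}", cue + word, word + cue}
        return any(f in KNOWN_PHRASES for f in forms)

    for w in CANDIDATES:
        if all(pairs(c, w) for c in CUES):
            print(w)  # prints "glasses", but only because the corpus says so

Which is the point: the "right" answer depends entirely on whose phrase corpus you carry around in your head.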


> Priming is a real effect, but that example of priming seems like it was poorly selected.

Except maybe it's not even a real effect: https://replicationindex.com/2017/02/02/reconstruction-of-a-...


Interesting! Thanks.


Well, it depends on whether you happen to have that idiom in your personal dialect. The most high-profile use of it I know of is from the 80s hit "The Future's So Bright, I Gotta Wear Shades", which starts out "I study nuclear science / I love my classes / I got a crazy teacher, he wears dark glasses"...


A really entertaining read, even if I am left wondering how much of the “Florida effect” stands up in these post-reproducibility-crisis times. I would also have liked more details on the calculation failures. I would guess that it was perhaps less a psychological bias and more a “you spent a large amount on a computer; if the calculations are wrong, you are fired” type of situation.

I’ve seen a lot of projects like this, where pressure builds on project leads for the chosen solution to be correct, not just for the goal to be reached. That means you end up not looking at obvious alternatives, because even if an alternative would save the project, “your solution failed”.

Does anyone know of similar stories or books that deal with engineering failures from a human psychology view?

Personally I’m reminded of the coverage of the Challenger disaster in “What Do You Care What Other People Think?” by Richard Feynman, which I can highly recommend reading if you haven’t already.


This illogical talk draws on "Thinking, Fast and Slow", Chapter 4, which Daniel Kahneman himself has since called into question.

http://www.decisionsciencenews.com/2012/10/05/kahneman-on-th...

The USCSB has great workplace-accident round-ups (CGI, no gore). Of course, expertise is what makes the workplace safer; it has no dark side.

Here's Murphy's law in action (if it's possible for something to happen, given enough time it will; so, wherever possible, make it physically impossible to happen rather than relying on process):

https://www.youtube.com/watch?v=Tflm9mttAAI
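
That "don't allow it to physically happen" idea has a direct software analogue: design the API so the unsafe state can't be left behind, rather than writing a procedure telling people to clean it up. A minimal Python sketch (the Valve class and names are illustrative, not from any real library):

    from contextlib import contextmanager

    class Valve:
        def __init__(self):
            self.is_open = False

    @contextmanager
    def opened(valve: Valve):
        # The valve can only be opened through this guard, so "forgot to
        # close it" is impossible rather than merely against procedure.
        valve.is_open = True
        try:
            yield valve
        finally:
            valve.is_open = False  # runs even if the work inside raises

    valve = Valve()
    with opened(valve):
        pass  # do the hazardous work here
    assert not valve.is_open  # guaranteed closed afterwards

The with-block plays the role of a physical interlock: cleanup isn't a step someone can skip, it's built into the only way to open the valve.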


An eerily similar case of an eruptive fire killing firefighters happened not so long ago (in 2007) in Croatia. [1]

In this case it was most likely that the group of firefighters did not have time to reach a safety zone and escape. [2]

[1] https://en.wikipedia.org/wiki/2007_Croatian_coast_fires#Korn...

[2] https://www.witpress.com/Secure/elibrary/papers/FIVA08/FIVA0...


Here's a link to the actual keynote if you want to watch it:

https://youtu.be/Yv4tI6939q0


Thanks for this link


OT, but not by much: if anyone is interested in how and why buildings and structures stand up or collapse, there are a couple of great books by Mario Salvadori:

https://en.wikipedia.org/wiki/Mario_Salvadori

Why Buildings Stand Up (1980) and Why Buildings Fall Down (1992)

that are elementary enough to be read by anyone, without being at all superficial.


There's an interesting analysis of the Hartford collapse case here: https://web.archive.org/web/20080108024915/http://www.eng.ua...


Wow: I lived in CT in 1978, and it was a wicked snowstorm; 10-foot snow drifts shut the coast down. I remember reading about the collapse in the New Haven Register.


Somewhat of a counterpoint - I didn't make it all the way through this book, but I think it has an interesting thesis:

The Death of Expertise: The Campaign against Established Knowledge and Why it Matters: https://amzn.to/3aHdIBB


I am super surprised to see this on LWN. But it was a fascinating read.


Just a note that priming has been a victim of the replication crisis.



