Richard Feynman: Do We Live in A Cargo Cult Society? (caltech.edu)
80 points by bertm on May 22, 2010 | hide | past | favorite | 41 comments



Absolutely. The biggest offenders in software engineering are the things people integrate into their identity (e.g. "I'm an Agile manager", "I'm a Java architect", "I'm a Lisp hacker"). Another red flag is when there are no clear ways of testing a hypothesis (e.g. "NoSQL databases are better for this task", "This management methodology is best for our organization", "Outsourcing is great if you do so-and-so"). Most, if not all, of the opinions regarding industrial software engineering are based on anecdotal experience, without the tiniest thought given to controlled experiments. C'est la vie.

PS: what's most confusing, for me, is that it seems impossible to fix all of the variables necessary to answer a question like "what technology is best for this real-world problem?". I've yet to hear of any good ways of systematically measuring the effect of technology choices on programmer productivity.


This is also the reason that fields like nutrition science aren't generally all that useful if what you want to do is something like offer diet advice. There are just too many moving pieces to offer a systematic explanation of what's going on, at least with the experimental limits we have right now. You can only track so many variables at once.


This is absolutely incorrect. We know way more about good diet and the effect of nutrition than we used to. We do not know everything, but claiming that nutrition science is not useful is horribly wrong and misleading.

If you are offering diet advice, you should absolutely make use of all that we do know: and that knowledge comes from nutrition science.


If our end goal is to offer everyday guides on what foods to eat, nutrition science has a really hard time offering anything useful.

The predictions are necessarily minutely targeted at particular nutrients in isolated trials. It's really not possible to come up with a diet plan that replicates those isolated trials. I guess what I'm saying is that we don't (currently) have the capabilities to run studies big enough to examine the whole system in all its intricate pieces.

You can do surveys - and I think they are probably the most broadly useful current nutrition studies, because they give recommendations like "food-related disease is significantly lower in people who eat a Mediterranean diet". That's an actionable prediction.

As another counter-point to my argument, isolated experiments seem to work well in providing recommendations like suggesting to stay away from carcinogenic foods, etc...


We're not that good at telling people what they should eat, but we're pretty damn good at telling people what they should not eat.


This presents an interesting paradox, though, because at the end of the day a decision is required. Do I eat thing X or thing Y? And in what proportion? So when a set of things is not well understood but a decision is required, many non-science strategies enter. So what can be done about this?


I want to say cheesy things like "The only thing certain is uncertainty" and talk about the pangs of Hamlet, but instead I'll offer this bit of advice from Colin Powell that I recall from previous HN threads:

Part I:"Use the formula P=40 to 70, in which P stands for the probability of success and the numbers indicate the percentage of information acquired."

Part II:"Once the information is in the 40 to 70 range, go with your gut."

From lesson #15 in this presentation about Colin Powell on Leadership (http://www.slideshare.net/guesta3e206/colin-powells-leadersh...)
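The rule is loose enough to sketch as a three-branch decision helper. A toy formalization: the 40/70 thresholds are Powell's, but the function name and return strings are my own invention:

```python
def powell_decision(info_fraction):
    """Colin Powell's 40-70 rule, as a sketch: act once you have
    between 40% and 70% of the information you could gather."""
    if info_fraction < 0.40:
        return "gather more information"
    if info_fraction <= 0.70:
        return "go with your gut"
    return "you have waited too long -- decide now"
```

The point of the rule, of course, is the middle branch: past some threshold, more information costs more than it's worth.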


You listen to science and not to crackpots who pretend to do science. Note that most advice regarding nutrition appearing in newspapers/blogs is total crap (as opposed to science).

If you can't distinguish science from crap (or can't be bothered to search for science) I would suggest you go with what seems reasonable to yourself.


That's the whole point though - science so far can't give much useful advice on what to eat, the topic is just too complex.

Science can give hedged explanations of what's been observed in relation to a particular nutrient, but that's a far cry from dietary advice.

What seems to end up happening is "we've observed a correlation between saturated fat consumption and heart disease" makes it out of the science world and gets turned into "avoid saturated fat at all costs" (and buy our replacement products). Then years later we realize it's a lot more complicated than that.


It was known at the time that correlation is not causation. That is a classic example of taking dietary advice from the incompetent.


you have to downgrade from induction to deduction. given the environment humans evolved in and what we know about primate digestive systems, what is unlikely to do me harm? answer: fruit, nuts, eggs. what is the biggest thing we do that monkeys don't? eat tons of processed grains.

simple answer: decrease processed carb intake, replace with fruit (primate diets are often 50% fruit). since eating no grains isn't an option, eat the ones considered healthier: corn, beans, brown rice.


And how long do monkeys live again?

That is not to say it is a bad idea, just that you do not present any evidence that grains are somehow bad. In fact, without considering all the other variables, one could just as well argue that grains are fantastically good for humans.


The archaeological and medical evidence (celiac disease) suggests that humans are not especially well adapted to eating cereal grains. That does not mean they are entirely bad, just that their hidden costs are relatively more likely to be whimsical and unpleasant.


And what do widely present allergies to pea- and other nuts, peas, carrots, tomatoes, citrus fruit, fish or shellfish (among others) tell us?


Don't forget eggs, bananas, peaches (and relatives), onions, carrots, etc. (include most foods; yes, I've known people with fantastically sensitive, life-threatening allergies to these foods).

One of my favorites is the genetic difference in perception of the herb coriander. Some percentage of people hate it because it tastes like soap, while the rest don't taste any soapiness at all and find it to be a pleasant seasoning.

People often make the mistake of conflating "evolved diet = good for you" with "ready availability of foodstuffs". Chimps will also happily eat a pile of steak and a milkshake if you give it to them. Alaskan salmon are fantastically good for us, but they don't exist at our evolutionary homestead. Some humans have consumed dairy for so long that they have developed lactose tolerance, and some eat such a high-protein diet that large amounts of carbohydrates (refined or otherwise) lead to very high rates of diabetes.

IOW, not every human has the same genetic lineage. Coeliac disease is interesting for some varieties of humans, but doesn't exist at all in other populations -- and even in the populations it does exist in, it has a pretty low frequency (generally less than 1% of the population in any area, no matter the genetic predisposition of the population). Egg allergies (one of the foods cited in this thread as "evolution safe") are fantastically more common by comparison. It's the second most common childhood allergy in the U.S.


Celiac disease makes up a tiny fraction of the cases of gluten intolerance, which as much as 20% of the population of, say, the USA may have. (Source: Dangerous Grains, by James Braly and Ron Hoggan) Then add in actual wheat allergies. That's a lot of people.

So yes, people can have allergies to all sorts of things. I've been there (long story; currently it's only cow's milk I have to avoid) and it's not fun.


I'm going to have to partially dispute you. I'll get to the point, but first let's clearly define what an allergy is, what an intolerance is, and what coeliac disease is.

Some raw numbers: wheat (gluten) allergies affect less than 1% of the population (it's something like .48%). It's not something you can hide: you either swell up and stop breathing when you come in contact with it, or you don't. It's actually hard to find in adults as most people grow out of it when they enter puberty. If you don't die within a short time from eating wheat (without treatment) due to an IgE antibody surge, you aren't allergic -- full stop. (yes, there are emergent allergies that occur later in life, gluten allergies are generally not considered to be in the class of those types of allergies). http://en.wikipedia.org/wiki/IgE

Wheat (gluten) intolerance is more prevalent. But so are most food intolerances. For example, it's generally considered that 3/4 of the adult population of the U.S. has a dairy intolerance. Intolerance != allergy. You can eat foods you are intolerant to. It might give you an upset stomach or bad gas or some such, maybe diarrhea and a really bad headache. But by those symptoms most of us are intolerant of high fiber diets, coffee and all forms of strong alcohol. It might be worse, like dizziness or some such, but if you don't have IgE antibodies in your system after contact with the suspect food it's not an allergy. In other words, probably 100% of the population is intolerant of some food some of the time. People often mistake "allergic" with "intolerant". Allergies are far more severe and critical than intolerance. Intolerance comes from a lack of an ability to process a given external agent due to some genetic predisposition -- like how many East Asians can't process alcohol properly or MSG makes my Aunt get a headache.

Coeliac disease occurs in about .75% of the population (as identified so far). It is a distinct and different disease from gluten allergies and gluten intolerances. Coeliac disease is an autoimmune disorder where the body destroys the lining (the villi) of the intestine as a response to the presence of the byproduct of gluten after it's been improperly modified by a faulty enzyme causing an IgA antibody flareup in the gut. It's only now really being diagnosed as it is easily confused with other bowel diseases prior (like IBS or Ulcerative colitis or granulomatous among others). Recent IgA antibody tests do a pretty good job of positively identifying the disease (but you have to be on a gluten rich diet for it to work, otherwise you'll have no antibodies to test for!). The number may grow, but probably not much, even wildly high estimates put it at under 3% of very localized, genetically predisposed populations. Whatever the case, it presents as a chronic disease, not an allergy. You don't eat wheat, then blow up, stop breathing and die. Your intestine slowly rots away, you become malnourished and your feces turns white.

Likewise, food intolerance != Coeliac disease != food allergy. The three things are different. You may as well bundle people with sun allergies and who have dry skin from harsh soap into your numbers if you are going to do that. The only common factor is that the causative agent is the same -- wheat. But even if you add up all the numbers of people with wheat allergies and people with coeliac disease, you likely end up with something less than 3-4% of the general population.

"Dangerous Grains" is an interesting book. But if all you look for is a duck, suddenly everything starts to look like a duck. I'd caution anyone from self-diagnosing based on descriptions of layman oriented symptoms from a book written to popularize an interesting theory. Many things cause precisely the same symptoms as gluten intolerance (and visa-versa). Dangerous Grains, while an great way to popularize recent science with the layman, is not the same as medical science.


"If you don't die within a short time from eating wheat (without treatment) due to an IgE antibody surge, you aren't allergic -- full stop."

What? With all due respect, this is a ridiculous definition of "allergic" and not at all the standard one. Are all the people with, say, pollen allergies not really allergic if all they get is a runny nose? I feel lousy when I eat anything containing cow's milk, and I have high IgE antibody levels for it, but I don't die. Nonetheless, that is an allergy.

I'm well aware of the difference between an allergy and an intolerance, as I thought I made clear by mentioning gluten intolerance as well as wheat allergy.

Dangerous Grains makes and cites plenty of references to the published literature, so it's hardly your average pop-sci book. A major point of it was that celiac disease is only one way in which gluten intolerance, that is, inability to properly digest gluten, can manifest.


Okay, so maybe "die" was a bit strong. :)

Immunoglobulin E antibody presence is the only meaningful definition for "allergic" in the literature. You are right, there are different degrees of immune system response w/r to IgE response. But no IgE? Not allergic -- probably intolerant -- may have something else.

Lots of things people say/think they are allergic to, they are actually intolerant of.

I'm not putting down Dangerous Grains, only cautioning that many people improperly self-diagnose. The authors are recognized practitioners in allergy medicine. I forget some of the terms off-hand, but mistaken self-diagnosis when learning about new medical topics is sometimes called "medical student syndrome".

Just because many of the easily observable symptoms of Coeliac disease are commonly found in any random selection of population, does not mean that Coeliac disease is common. It doesn't help that most of the symptoms may or may not even be present!

But really here's the list of symptoms: Abdominal pain, Abdominal distention, bloating, gas, indigestion, Constipation, Decreased appetite (may also be increased or unchanged), Diarrhea -- chronic or occasional, Lactose intolerance (common upon diagnosis, usually goes away following treatment), Nausea and vomiting, Stools that float, are foul smelling, bloody, or “fatty”, unexplained weight loss (although people can be overweight or of normal weight upon diagnosis), Anemia (low blood count), Bone and joint pain, Bone disease (osteoporosis, kyphoscoliosis, fracture), Breathlessness (due to anemia), Bruising easily, Dental enamel defects and discoloration, Depression, Fatigue, Growth delay in children, Hair loss, Hypoglycemia (low blood sugar), Irritability and behavioral changes, Malnutrition, Mouth ulcers, Muscle cramps, Nosebleed, Seizures, Short stature, unexplained Skin disorders (dermatitis herpetiformis), Swelling, general or abdominal, Vitamin or mineral deficiency, single or multiple nutrient (for example, iron, folate, vitamin K), Type-1 diabetes, autoimmune thyroid disease, autoimmune liver disease, rheumatoid arthritis, Addison's disease, Sjogren's syndrome.

Almost everybody I know has half of these symptoms half of the time -- and the ones that are less common are from something that's not CD (hair loss? how about Rad poisoning. Short children? Pituitary irregularity. Mouth ulcers? How about being a skank in High School?)

Here, let's try some example differentials: Patient is 28, Female, Caucasian, blood pressure is normal, pulse is normal, height and weight are average. Symptoms: Anemic, Bloating, Constipated, Abdominal Pain, Nausea and vomiting, Bruising easily, Fatigue, irritability and behavioral changes, muscle cramps, swelling, vitamin D and Calcium deficiency in lab work.

Does she have a) Lupus? b) PMS? c) or one of the .5% of the adult population with CD? (hint, it's never Lupus).

How about this: Patient is 62, Female, Caucasian, BP is elevated, pulse is elevated, height is below average and weight is average. Symptoms: Anemic, Indigestion, Occasional Diarrhea, Bone and Joint Pain, Bone disease (osteoporosis), Bruises easily, Dental discoloration, Fatigue, Hair loss, Hypoglycemia, Malnutrition, Short Stature, unexplained, Hives, Vitamin and mineral deficiency.

Does she have a) Lupus? b) Old, eats a crap diet of mostly snack foods, works a high stress sedentary job and drinks lots of coffee and tea and just starting eating a high fiber diet for the first time in her life? c) one of the .5% of the adult population with CD?

Or let's try another: Patient is 34, Male, Caucasian, BP is normal, pulse is elevated, height is average, weight is above average. Symptoms: Thyroid disease, type 1 diabetes, fatigue, vomiting, foul smelling stools, diarrhea, irritability, nosebleeds, vitamin and mineral deficiency.

Does he have a) Lupus? b) Has hereditary Grave's disease, is fat, eats a crap diet of pizza, candy and Jolt, has poor social skills, and just ate a bad piece of three day old pepperoni and we should check to see if he has sepsis and swab for known food-borne pathogens? c) one of the .5% of the adult population with CD?

My point is that the observable presentation symptoms for CD are crap. They could be symptoms for anything. Hence the caution on self-diagnosis. It also doesn't mean people should go out of their way to inconvenience themselves to avoid grain products and food with gluten in it because they get bloated on occasion or have stinky poo.

http://digestive.niddk.nih.gov/ddiseases/pubs/celiac/ "Diagnosis involves blood tests and, in most cases, a biopsy of the small intestine."


Not to be rude, but did you read the fking article?

What you are describing is pseudoscience. It is entirely possible to use your random, naive, and quasi-accurate observations to generate a hypothesis. But to come to your "simple answer" is the type of overly simplistic thinking that Feynman is criticizing.

"Science: It works, bitches." -- xkcd


Actually, you discarding his hypothesis without testing is the kind of thinking Feynman is criticizing.

He has a hypothesis. Let's try to break it.

If we fail, good for him ;-)


yeah, I don't normally talk about nutrition because I don't want to get into a huge debate over edge cases that everyone seems to love (oh look, it happened in this thread too). in the absence of empirical data we have to use deduction or induction to arrive at a reasonable hypothesis that we then test.

I recommend Good Calories, Bad Calories.


>systematic explanation

Yes. Too many scientists aren't trying to explain things; rather they wander around looking for new correlations. There's an abundant supply of correlations wherever one looks.


And the number of candidate correlations grows combinatorially with the already rapid growth in the amount of data they can analyze.

It's been highlighted before we need better standards for what's meaningful and what's noise.
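The multiple-comparisons problem behind this is easy to demonstrate: generate enough pure-noise variables and "strong" correlations appear by chance alone. A minimal sketch (the sample sizes and the 0.5 threshold are arbitrary choices for illustration):

```python
import random

random.seed(0)

def corr(xs, ys):
    """Pearson correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# 40 variables of pure noise, 20 samples each: no real relationships exist.
variables = [[random.gauss(0, 1) for _ in range(20)] for _ in range(40)]

# Pairwise correlations grow as n*(n-1)/2 -- here 780 pairs from 40 variables.
pairs = [(i, j) for i in range(40) for j in range(i + 1, 40)]
strong = sum(1 for i, j in pairs
             if abs(corr(variables[i], variables[j])) > 0.5)
print(f"{len(pairs)} pairs tested, {strong} 'strong' correlations in pure noise")
```

With small samples and many pairs, some correlations will clear the threshold purely by chance, which is exactly why correction procedures (Bonferroni and friends) exist.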


We aren't doing physics here. If we can't observe the effects of something easily, it probably isn't that important.


Actually, that is similar to the position of a physicist, just without the words "easily" and "probably". For instance, what happened before the big bang has no effect on the current universe, which is why there's so much research into the first moments of the big bang, and not into any point in time before then.

But to address your comment specifically, I couldn't disagree more. Unless, of course, you're just trolling by making a ridiculous statement without any backing evidence in a discussion about pseudoscience, in which case, well done.

First, all of the parent's examples are hypotheses that are hard to test, not hard to observe. It's hard to test just how much better NoSQL/SQL is for a given task, but not hard to see that there are differences in performance if you were to, say, implement both systems. I'd like to see you change the management methodology for a company and not see a change. Same goes for his outsourcing example.
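Observing the performance difference, at least, is mechanical: fix a workload, run it against both implementations, and time it. A hedged sketch of such a harness (the two lambdas are placeholder workloads, not actual SQL/NoSQL code):

```python
import time

def benchmark(task, repeats=5):
    """Time a zero-argument callable, returning the best of several runs
    (best-of-n damps scheduler and cache noise)."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        task()
        best = min(best, time.perf_counter() - start)
    return best

# Hypothetical stand-ins for "implement both systems": any two callables
# exercising the same workload against the two backends would slot in here.
impl_a = lambda: sum(i * i for i in range(100_000))
impl_b = lambda: sum(map(lambda i: i * i, range(100_000)))

print(f"A: {benchmark(impl_a):.4f}s  B: {benchmark(impl_b):.4f}s")
```

Measuring performance this way is easy; the thread's harder question -- whether the measured workload represents the real task -- is where the hypothesis-testing actually lives.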

Second, I do not think there is a logical connection between your conclusion and your premise. Since you haven't given a single iota of proof for your claim, I don't think I need any proof to counter it.

Finally, even if we aren't doing "physics" (what ever your definition of that is), why does that mean pseudoscience is acceptable?


Maybe there is no strict ordering for those technologies. Or maybe what works best depends on the individuals involved.

Also, there is the "no silver bullet" theory.


You and I agree. Sorry if I didn't make that clear, but the way I read the post at the top of this thread, each claim could be backed with appropriate evidence for and against (as Feynman says every scientist's duty is). For a sufficiently well defined situation X, the claim that "Y is the most appropriate solution for X" can and should be tested appropriately.

The "appropriate" amount of testing would also depend on the situation X. The 'difficulty of observation' may be a factor here, but is not the only one. If the situation doesn't require 'hard' evidence (e.g. the consequences are minor) that's fine, but I wouldn't say any of the above examples fall under that category, and people making decisions based on 'soft' evidence should do so knowing that it is what it is.

Also, whether there is or isn't a strict ordering, dependence on the individuals, or a silver bullet would all be facts turned up by sufficient application of the scientific method. This relates to Feynman's points on how it's necessary to do experiments where you "don't learn anything new" and again, how every scientist is ethically bound to report results that contradict their theory.


That's the kind of thinking that leads you into wondering how the hell your code base got so bad, how you fell so far behind the competition, etc. A lot of critically important aspects of modern software engineering and craftsmanship are extremely subjective. The difference between success and failure in software can often come down to having the right group of people with good judgment in charge.


I'd argue that code contests like Google Code Jam are the closest thing to 'controlled experiments' that we have in software engineering. There's a set of specs for what is to be implemented, and a limited time in which to do it. (With different contest structures/judging criteria to test different things, like speed, adaptability/maintainability, etc.) By trying to entice as many programmers/programmer teams as possible to enter, differences between individual programmers are hopefully lessened.

Even your question, "(What is) the effect of technology choices on programmer productivity?", tries to treat programmers as interchangeable, as if tech choice A would make all programmers more productive. Programmer productivity, like intelligence, is something that would give you a license to print money if you could measure it accurately and precisely.


Be careful. Software engineering is not a science. Computer science is a science but, as is explained so well in the SICP videos I watched, compsci's relation to computers is the same as astronomy's to telescopes.

Software engineering can - and should - use scientific method, but it has more to do with carpentry than with physics.

And I, personally, regard medicine as the biggest offender. Homeopathy should be a criminal offense.


I feel like we go round and round, dealing with the same problems that our ancestors had to deal with.

What I see here is part of the human condition. Pitfalls that "most" people are doomed to repeat. People are born with similar ways of thinking. I would propose that intuition is almost always wrong. This is one reason we must educate people. Unfortunately, memories and communication are not precise. And I realize that all job fields require some sort of education or training.

There exist complex ideas, technologies, and material things (e.g. biology) that are not comprehensible by a single person at a time. This is why we have groups of people that make products, and not individuals.

Also, people age and lose their usefulness. If only we could educate people who would keep increasing their intelligence and never die.

We live in a world of conclusions based on facts. A conclusion can be wrong, a fact cannot. News media, blogs, conversation are an exchange of conclusions. We can't communicate in just facts because we can't work with that much data at a time.

If only, one could link to another: a computer or another brain and be able to access knowledge and its accuracy. Is this possible, I don't know.

There are many problems worth fixing (but that probably won't be fixed any time soon): a phonetic English alphabet with a different letter for each unique sound, a Human Language to AI command compiler, optimizing the human mental process (engineering the human mind), building biospheres in inhospitable regions of the Earth, fast internet that is cheap (100Mb/s (and I mean Megabit)) -> HD remote-controlled robots, tactile feedback for TV just like we have visual and sound, engineering ways of rejuvenating/regenerating humans so we can control our lifespan, creating or finding what makes humans "sentient" or "have a soul" or "conscious" or "self-aware" -> preserving this "consciousness" without a body for long trips of space flight.


Choice quote:

  The first principle is that you must not fool yourself
  -- and you are the easiest person to fool.


There's one paragraph I don't get:

    "I tried to find a principle for discovering more of these kinds of things, and came up with the following system. Any time you find yourself in a conversation at a cocktail party in which you do not feel uncomfortable that the hostess might come around and say, "Why are you fellows talking shop?" or that your wife will come around and say "Why are you flirting again?"--then you can be sure you are talking about something about which nobody knows anything."
I've been trying to figure out what he meant, so I can apply the same method.


He meant that if you ever catch yourself in a conversation where no one knows anything about the subject matter, you should end the conversation.


I think he's saying that certain topics allow everyone to participate because no one knows anything. If specialists were in the conversation, it would become shop talk.


Previous submission:

http://news.ycombinator.com/item?id=993150

A good book for background on how people in general think is What Intelligence Tests Miss by Keith Stanovich.

http://yalepress.yale.edu/yupbooks/book.asp?isbn=97803001238...

http://www.amazon.com/What-Intelligence-Tests-Miss-Psycholog...

As Feynman points out, "The first principle is that you must not fool yourself--and you are the easiest person to fool." Human cognitive illusions are part of the human condition, and every scientist has to guard against them ceaselessly.


I just finished reading this: http://www.nthposition.com/thelastcargo.php

Very interesting. I especially liked the term "cargo prophets."

The article paints cargo cultists as performing, with fervor and rigor, a "magical repetition" of an "inexplicable complex of delusions."

And the cargo prophets, being the messengers, expound new canon to the "John Frum" myth which further glorifies and necessitates this "inexplicable complex of delusions," or rituals and methods.

The methods and rituals these cargo prophets ask followers to perform are NOT reason or integrity.

I could go on with my thoughts on this, but basically yes. I do see a very clear parallel.


How can I get a link to the scribd HTML5 reader?


I'm not sure. With the big announcement that was made about how they're switching and everybody here on HN celebrating, I was expecting all the [Scribd]'s to turn into HTML5 links. Is that because the HTML5 isn't ready yet, or because the HN code hasn't been updated?



