"By the end of that summer of 1983, Richard had completed his analysis of the behavior of the router, and much to our surprise and amusement, he presented his answer in the form of a set of partial differential equations. To a physicist this may seem natural, but to a computer designer, treating a set of boolean circuits as a continuous, differentiable system is a bit strange. Feynman’s router equations were in terms of variables representing continuous quantities such as “the average number of 1 bits in a message address.” I was much more accustomed to seeing analysis in terms of inductive proof and case analysis than taking the derivative of “the number of 1’s” with respect to time. Our discrete analysis said we needed seven buffers per chip; Feynman’s equations suggested that we only needed five. We decided to play it safe and ignore Feynman."
Feynman is great, very amusing and obviously brilliant. But I've also worked with folks like him before who end up as fish out of water in fields they don't know the conventions for and can't be bothered to learn.
Lots of time is wasted trying to figure out how to get the rest of the company to communicate with the lone genius, when it's obvious they're smart enough to go away for a week, learn enough of the field they're working in, and use its language and vocabulary to be minimally effective.
I've been on the receiving end of these kinds of analyses, and the result is that they seem to exist purely to showcase how smart the individual is while providing no other meaningful input to the effort. In this case the engineers ignored Feynman, did their own analysis anyway, and followed their own conclusion...the subtext here is that multiple people were not getting along with Feynman's way of doing things.
He turned out right in the end of course, because Feynman, but there's lots of people who think they're Feynman and aren't and it's hard to tell the difference sometimes.
> He turned out right in the end of course, because Feynman, but there's lots of people who think they're Feynman and aren't and it's hard to tell the difference sometimes.
Well, he had a Nobel Prize in Physics for starters; that's one way to tell the difference. Granted, no one's infallible, but it's a good predictor. Not that it would have spared him from FizzBuzz in an interview these days..
This is a great point. However, I would generalize a bit and say the problem sometimes goes beyond ego: someone's technical aptitude can outstrip their ability to communicate. This is a more common problem than just the smartest guy in the room, and at some point you either trust each other, or agree to disagree.
That said, there is an individual far worse than the brilliant teammate who's hard to understand: people whose ability to listen is worse than their technical aptitude. Just one of them in a meeting will destroy all progress and disrupt any ideas that don't mesh with their preexisting thoughts.
On the flip side, it's also frustrating to know you're right about something, but because your co-workers don't have the same education as you they completely ignore you.
They are both real problems. Anecdotally, I've seen a lot more cases of specialists failing to communicate effectively than of non-specialists ignoring well-expressed advice. It's kind of endemic among techies, unfortunately.
I think you and the parent are agreeing. :) They didn't know what to make of Feynman's analysis, because it used techniques they weren't familiar with. It was clearly an approximation; were the error terms really ignorable?
They only used 5 when it turned out they couldn't manufacture 7 and had no other choice. So they only trusted his analysis out of desperation / wishful thinking, not out of objective reasoning.
But the parent comment uses this as an example to complain about experts from different fields wasting people's time or whatever, and it's a terrible example to use. Feynman brought new insights that never would have occurred to computer scientists. Of course they were cautious at first. But they ended up trusting his analysis enough to go through with manufacturing, and he was right in the end.
Additionally, the parent comment complains about people like Feynman not being able to communicate, but there is nothing about that in the story. It goes on and on about how Feynman was a great communicator and explained his ideas clearly.
> Lots of time is wasted trying to figure out how to get the rest of the company to communicate with the lone genius
Feynman himself disdained this sort of lone-genius puffery; he was critical of it in Murray Gell-Mann. I don't think Feynman was a loner at all. He had people he liked and could relate to, and other people he didn't, just like anyone.
> I've been on the receiving end of these kinds of analysis and the result is that they seem to exist purely to showcase how smart the individual is and to provide no other meaningful input to the effort.
Bingo! Always look at the cost/benefit to the group. Everyone who seems like this on the surface tells themselves that they are striking a blow for the truth. But what is the overall effect? Did they primarily bring the group's perception of the truth closer to reality, or did they mostly establish themselves as the Alpha nerd? It's usually both, but what is the mix? Is the undertone one of kindness, or one of cruelty?
> there's lots of people who think they're Feynman and aren't and it's hard to tell the difference sometimes
Feynman really liked continuous solutions. It's too bad he wasn't around for the deep learning era. Or for bufferbloat.
The "average number of 1 bits in a message address" makes me think of routing in the NCube. Like the Connection Machine, the NCube was a big array of small CPUs. Bigger CPUs than the CM, though, and running independent programs. The NCube had 2^N CPUs, up to 1024, and each was connected to N neighboring CPUs in N dimensions. Each CPU had an N-bit ID, representing its position in the binary N-dimensional array. The connections between CPUs were thus between ones that differed in address by exactly 1 bit.
So routing worked by XORing the current CPU ID with the destination ID. The 1 bits in the result then represented possible paths to neighbor nodes on which the packet could be sent. Any such path would work, and all of them are the same length. At the next node, there would be one fewer 1 bit in the difference. When there were no 1 bits left, the packet had arrived.
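That scheme is easy to sketch in a few lines. This is a minimal illustration, not NCube's actual firmware: node IDs, the bit-selection policy (lowest set bit), and the example addresses are all my own assumptions.

```python
def route_step(current: int, dest: int) -> int:
    """Return the next node on one shortest hypercube path from
    current to dest, or current itself if already there."""
    diff = current ^ dest      # 1 bits mark dimensions still to traverse
    if diff == 0:
        return current         # no differing bits: packet has arrived
    lowest = diff & -diff      # pick any set bit; here, the lowest one
    return current ^ lowest    # flip that bit: hop to that neighbor

# Walk a packet from node 0b0000 to node 0b1011 in a 4-cube.
path = [0b0000]
while path[-1] != 0b1011:
    path.append(route_step(path[-1], 0b1011))
print([format(n, "04b") for n in path])  # each hop flips exactly one bit
```

The hop count equals the number of 1 bits in the XOR of source and destination, which is exactly the "distance" measure the next comment describes.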
Nodes need packet buffers. How much buffering is required? That's probably what Feynman was working on.
The number of 1 bits is a measure of distance to destination. The average number of 1 bits is a measure of network traffic. As a discrete problem, this is a mess. Feynman converted it to a continuous flow problem, for which there's known theory, some of which Feynman had developed. He'd done a lot of hydrodynamics work at Los Alamos.
Van Jacobson, who started as a physicist, also saw network congestion that way.