This reminds me of the drawing of WWII bomber planes with bullet holes mapped out, showing that the critical systems were where there were no holes, not because the hull was strong enough there, but because a hit there would mean the plane never returned to be recorded.
There's an episode of Seinfeld about this. George realizes that he always makes the wrong decision, so he decides to do the opposite of whatever he would normally do, and it ends up working out for him pretty well.
There are some real life “George” figures that I pay attention to. If they think something is a great idea and that’s in line with something I’m thinking about doing, I’ll take a hard look at what I might be missing since X’s judgment is usually terrible. In an odd way, their consistently bad judgment has proven a valuable tool.
Most of the best decisions of my life were the opposite decisions my parents would have made. They're not stupid people, but they grew up in a totally different time and place. Nowadays I sometimes ask for their advice as "inverse input."
I suspect part of this is that there are other parts of society (corporations) that are preying on their older intuition. It also means that as we get older, we should also question "obvious answers", because others will have learned to take advantage of them (and any accumulated wealth).
Yeah, I would like some qualifiers too because I've seen reddit hype up relatively obscure product niches at the time that later became popular with mainstream consumers.
There are some political figures I do this with - if I'm unsure about an issue, I'll do the opposite of what they say. Bernie Sanders is probably the biggest - even his own party usually votes the opposite of what he wants.
There are also some true idiots in the Republican party that work similarly.
Perhaps Charlie Munger's enthusiasm for this sort of inverse reasoning explains his (now cancelled) plan for the windowless dorm building he designed for UC Santa Barbara -- he was creating the worst possible dorm building in order to get insights into what a good dorm building would be.
I love deep dark places. My favorite memory is of visiting the Grand Canyon Caverns. I loved when they turned out the lights and it was absolutely pitch black. A dorm like that would be amazing for me. No street lights, no car noise, no windows for people to peek into. However, I can see why it would make a bunch of non-troglodyte people feel weird.
Decommissioned lookout bunker stations here in the cliffs overlooking the SF Golden Gate have extremely deeply sunken, thick-concrete buried barracks. We turned off the flashlights and it's the darkest I've ever been in, and absolutely quiet, with a group of 30+ kids and chaperones. Had better ears back then, so no ringing ears. And sunken structures have the benefit of a stable temperature of about 65°F compared to the cold outside air.
I've had tinnitus for as long as I can remember. I kind of like the sound of the ringing. Reminds me of Tibetan singing bowls or a Buddhist bell. The sound of my life.
Or, maybe, Munger figured out that what was viewed as obviously correct wasn't actually correct, and created a unique design to explore the possibilities when the obviously correct thing was removed.
The rebuttals to his design are basically just "every room needs a window", without any real justification. Do you think he didn't think of that? That he just forgot to put windows in?
My dorm at college had windows, and they were entirely worthless. There were 2 windows, 2'x6', frosted glass, and they could open about 3". I don't think they added much to the room.
> Or, maybe, Munger figured out that what was viewed as obviously correct wasn't actually correct
Nah, looking at the floor plans, he was just optimizing for cost at the expense of user experience.
To rebut your anecdata with my own: My dorm rooms at college had large 5' x 4' windows that opened. The unlimited fresh air and view over the campus helped me keep my sanity while studying for a 4-year engineering degree. A windowless room would have driven me stark raving mad in the first year.
> The rebuttals to his design are basically just "every room needs a window", without any real justification. Do you think he didn't think of that? That he just forgot to put windows in?
I think Munger should build a dorm room for himself that doesn't have windows and live in it for a few years before deciding that other people don't need them.
> The rebuttals to his design are basically just "every room needs a window", without any real justification. Do you think he didn't think of that? That he just forgot to put windows in?
Whether he thought of it or not, bedrooms without windows are unsafe because in a fire you're trapped. Safety regulations are written in blood.
This is basically the idea behind Chaos Engineering. Think of all the ways a system can break, make it happen, and see if we survive. If we don't, fix it so we do next time.
Simply being aware that there exist things that you don't know you don't know can save your project.
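The "break it on purpose, then fix it so we survive" loop can be sketched in a few lines. This is a minimal illustrative sketch, not any real chaos-engineering tool's API; the wrapper names, failure rate, and retry count are all made up for the example:

```python
import random

def chaotic(func, failure_rate=0.3, rng=random.Random(42)):
    """Wrap a callable so it sometimes fails, like a flaky dependency."""
    def wrapper(*args, **kwargs):
        if rng.random() < failure_rate:
            raise ConnectionError("injected failure")
        return func(*args, **kwargs)
    return wrapper

def with_retries(func, attempts=10):
    """The 'fix it so we survive next time' part: naive retry around the flaky call."""
    def wrapper(*args, **kwargs):
        for i in range(attempts):
            try:
                return func(*args, **kwargs)
            except ConnectionError:
                if i == attempts - 1:
                    raise  # give up after the last attempt
    return wrapper

# Inject failures into a dependency, then verify the system still works.
flaky_fetch = chaotic(lambda: "payload")
resilient_fetch = with_retries(flaky_fetch)
results = [resilient_fetch() for _ in range(100)]
```

Without the retry wrapper, roughly a third of the calls would blow up; with it, all 100 succeed, which is exactly the "did we survive?" check.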
This is the general basis for why I tend to pick tools & concepts that are at least a half-decade old. The space of unknown unknowns in something that has been around this long should be vanishingly small, especially if we are applying the tool or concept in a typical way.
But definitely give it some time to think these through upfront. Weigh them by two factors: severity and frequency. Then try to tackle as much as you can, working down from the top of the severity * frequency list.
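The severity * frequency ranking above is just a sort by a product score. A minimal sketch, with made-up risks and an assumed 1-5 scale for both factors (the scale is my choice, not from the comment):

```python
# (name, severity 1-5, frequency 1-5) -- hypothetical example risks
risks = [
    ("database outage",        5, 2),
    ("typo in marketing copy", 1, 4),
    ("expired TLS cert",       4, 1),
    ("flaky CI job",           2, 5),
]

# Rank by severity * frequency, highest first; tackle from the top.
ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
for name, sev, freq in ranked:
    print(f"{sev * freq:>2}  {name}")
```

Note that a rare catastrophe ("database outage", 5x2) outranks a frequent nuisance ("typo", 1x4), which is the point of weighing both factors instead of just fixing whatever happens most often.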
Definitely agree it sounds like good system design. I think the overlap is the big picture thinking. It provides a way of re-framing goals to give a clearer picture of the most important things.
So not just list everything that could go wrong, but maybe: what's a terrible day for your service/system that's most likely to happen? Cascading failures? Outage that makes accessing/recovering your system impossible? Backups unusable?
I don't think the value of inverse reasoning is that it lets you think of new ideas. If you're trying to list all the ways something could go wrong, you're pretty much just listing the ways it could go right, and flipping them. What I mean is, I think your list of bullet points for "how to have a great career" would be about as long as your list of bullet points for "how to have a terrible career". Both lists would usually contain the same essential information, just framed differently.
Where I think such a practice can be useful is in forcing you to confront unpleasant possibilities you would otherwise try to ignore, and thus at least briefly plan for them.
An interesting concept. If you want to succeed but cannot see the solution, spend a lot of time thinking about how you might fail, then just don't do that.
The only problem I see with this is that the potential ways to fail are unbounded, maybe even infinite. While the ways to succeed may also be unbounded, they grow much more slowly. That is, some infinite sets are larger than others.
A bit stronger: Think about all the ways you could fail, then make sure those things don't happen (to the extent you can).
That is, failure isn't always caused by me doing something. Sometimes it's caused by something external. My wonderful new box might not ship if we can't get the chips to make it. I should probably look at lining up a second source. (Yeah, a couple of years ago we saw that even that may not work. You can't always prevent everything bad that can happen. You can prevent some of it, though, and that makes enough difference to be worth trying.)
Not only did I buy a Zune, I was an early enthusiastic customer of their streaming subscription service and thought that it would topple the iTunes sales model. We all know how well that went.
Same here. "Split brain," logical, analytical person. We aren't even a rounding error in market numbers. If I like it, it is probably the superior product providing the most value... which rarely ever matters.
I see the value here being: when you can’t understand the causal relationships for success, use the causal relationships for failure that you can understand and then avoid specific failure modes.
“Friend gives generally bad advice” is not that kind of a clear causal relationship.
I’ve often thought in life that I’ve learned most of what I know by seeing examples of what not to do. If for no other reason than that there are many more people running around without a clue than truly wise individuals making well-reasoned decisions. At least that’s how it’s seemed IME; I’m sure in reality people are just doing the best they could and making decisions in the moment, but they often seem to be thinking in the extremely short term (and then sometimes, still wrong even for that).
I saw something like this for suicide. Dark, and fairly graphic. All the possible ways you might want to kill yourself, complete with descriptions of how it usually goes wrong and you'll just be maimed for life instead. The intent was clearly to convince people not to do it.
One of my favorite versions: when Steve Jobs immersed an early iPod in water to see if any bubbles came out; bubbles meant there was still empty space inside, so it could be made smaller and tighter.
In mathematical optimization theory, duality or the duality principle is the principle that optimization problems may be viewed from either of two perspectives, the primal problem or the dual problem. If the primal is a minimization problem then the dual is a maximization problem (and vice versa). Any feasible solution to the primal (minimization) problem is at least as large as any feasible solution to the dual (maximization) problem.
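Weak duality is easy to see numerically. Here's a small sketch with a toy linear program I made up for the example (the specific LP and the random-sampling check are mine, not from the comment): every feasible primal value sits above every feasible dual value, and the two sides squeeze toward the shared optimum (9 for this LP).

```python
import random

rng = random.Random(0)

def primal_feasible(x1, x2):
    # primal: min 2*x1 + 3*x2  s.t.  x1 + x2 >= 4,  x1 + 2*x2 >= 5,  x >= 0
    return x1 >= 0 and x2 >= 0 and x1 + x2 >= 4 and x1 + 2*x2 >= 5

def dual_feasible(y1, y2):
    # dual:   max 4*y1 + 5*y2  s.t.  y1 + y2 <= 2,  y1 + 2*y2 <= 3,  y >= 0
    return y1 >= 0 and y2 >= 0 and y1 + y2 <= 2 and y1 + 2*y2 <= 3

# Sample random points, keep the feasible ones, evaluate each objective.
samples = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(20000)]
primal_vals = [2*a + 3*b for a, b in samples if primal_feasible(a, b)]
dual_vals   = [4*a + 5*b for a, b in samples if dual_feasible(a, b)]

min_primal, max_dual = min(primal_vals), max(dual_vals)
assert max_dual <= min_primal  # weak duality: the dual bounds the primal from below
```

For this LP the optimum of both problems is 9 (primal at x = (3, 1), dual at y = (1, 1)), so the sampled primal minimum approaches 9 from above and the sampled dual maximum approaches it from below.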
Same thing applies to figuring out what you want. First identify all the stuff that you don't want - particularly things you already have and are willing to get rid of. Getting rid of things really reinforces the notion that you don't want them. This also makes space and frees up time to explore things you do want even if you're not sure what that is yet.
I feel like there is/should be some kind of programming language design equivalent of this. If you wanted to make programmers basically incapable of writing correct programs, how would you do it?
Here's where I'd share examples and it wouldn't be funny and instead start a flamewar.
https://en.wikipedia.org/wiki/Survivorship_bias