Look for the Hidden Assumptions

What can we learn from the bullet holes on WWII airplanes about hidden assumptions, and how can it help us make better decisions?

Hidden Assumptions

Today I am going to share a story about hidden assumptions. I am reading another book about mental models (“How Not to Be Wrong” by Jordan Ellenberg – link to Goodreads) and there is one example that stood out, especially since I had already read about it multiple times. The example is about the mathematician Abraham Wald, who helped the American military make better decisions during WWII.

When I need to make a decision, I usually find it very easy to get all the known knowns and known unknowns and put them on the table. But all these are explicit assumptions. For example, I recently changed teams within my organization twice in rapid succession. In both cases, I considered the immediate assumptions. I expected the team I was leaving to keep getting worse and I expected the team I was joining to be better than my previous team and to keep getting better. But I kept asking myself, “Was I wrong?”

Here, I will share the story of the bullet holes in the WWII airplanes and how Abraham Wald's insight corrected a wrong assumption that could have cost many lives.

Overview

Hidden assumptions are assumptions that influence the outcome of a decision but that you have not taken into account. If you want to be wrong less often (and that is the best you can do; you cannot avoid being wrong altogether), you need to uncover these assumptions.

What is the relationship with WWII?

The example that Jordan Ellenberg shared in his book was really compelling. You see, military airplanes need to strike a balance between the armour they carry and their weight. If you put on too much armour, you make the plane less manoeuvrable (and waste fuel), but if you put on too little, it is going to get shot down. So the military came to the Statistical Research Group (SRG), where Wald was working, and showed them data about the bullet holes per square foot on the airplanes. They argued that putting more armour on the areas with the most holes would improve the survivability of the planes.

But then Wald came in and asked a simple question: “Where are the missing holes?” The military could only share data about the airplanes that had made it back safely, and that is survivorship bias. Wald argued that the missing airplanes, the ones that got shot down, had a completely different pattern of bullet holes. So, instead of fortifying the areas with the most holes on the surviving planes (the fuselage and the wings), Wald proposed fortifying the areas that were hit on the missing airplanes (the engine and the fuel tank).
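To make the bias concrete, here is a minimal Python sketch of the situation. The section names, hit counts, and loss probabilities are invented for illustration; they are not from Ellenberg's book. The point is only to show how looking at returning planes alone hides the most dangerous hits.

```python
import random

# Hypothetical plane sections and the chance that a hit in each section
# brings the plane down. These numbers are made up for illustration.
SECTIONS = ["fuselage", "wings", "engine", "fuel tank"]
LOSS_PROBABILITY = {"fuselage": 0.05, "wings": 0.05, "engine": 0.6, "fuel tank": 0.5}

def simulate(num_planes=100_000, hits_per_plane=3, seed=42):
    random.seed(seed)
    observed = {s: 0 for s in SECTIONS}   # hits counted on planes that return
    actual = {s: 0 for s in SECTIONS}     # hits on all planes, returning or not
    for _ in range(num_planes):
        hits = [random.choice(SECTIONS) for _ in range(hits_per_plane)]
        for section in hits:
            actual[section] += 1
        # The plane makes it back only if none of its hits brings it down.
        survives = all(random.random() > LOSS_PROBABILITY[s] for s in hits)
        if survives:
            for section in hits:
                observed[section] += 1
    return observed, actual

observed, actual = simulate()
print("Hits you see on returning planes:", observed)
print("Hits that actually happened:     ", actual)
# Hits are spread evenly across sections, yet the returning planes show far
# fewer engine and fuel-tank hits -- exactly the "missing holes" Wald noticed.
```

If you armour the sections with the most visible holes, you end up protecting the parts of the plane that can already take a hit, which is the wrong conclusion the raw data invites.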

How can we take advantage of this story in our lives?

Three steps to get better at detecting hidden assumptions

First of all, in any situation you can pause before making a decision and ask yourself:

  • What is the data I know not showing me?
  • What are the assumptions on which the data is based?
  • Are these assumptions true?

In the example above, the hidden assumption was that the airplanes in the data were a random sample of all airplanes. But this turned out to be wrong.

Then, you can look for the missing piece. If the original assumptions are not true, how can you generate a new set of assumptions? This can be a fun exercise.

Finally, you run the same check with the new assumptions: are they true?

Three steps to uncover hidden assumptions.

One action

Without looking for the hidden assumptions, you will keep making decisions that you think are data-driven but are not, because they do not use the whole data.

Next time you need to make a decision, ask yourself one question: What am I not seeing?

Next steps

What are the next steps?

If you liked my article, please head over to my contact page, where you can find various ways to contact me.

Thanks!
