The Power of Rethinking Problems

Every week brought the same disaster.

Aircrews stood on runways, watching fleets of B-17s limp back to base. Some were smoking and on fire. Others had chunks missing. Many never returned.

It was the early 1940s, and Allied air missions weren’t going to plan.

Too many bombers were being shot down, sent screaming to the ground with their crews trapped inside.

[Image: diagram of a plane shot down, via Wikimedia]

 

Identifying the problem

Allied leaders were competent in their own right, but they struggled to find the right solution.

Military circles were prone to monolithic thinking, and stepping outside convention set people’s nerves on edge. But officials knew something had to give. Lives rested on how these changes played out, and there wasn’t much time to work with.

They hired Abraham Wald, a man identified as a genius at an early age.

He had a Ph.D. in mathematics and was trained by Karl Menger himself, the mathematician behind Menger’s theorem and the Menger sponge you may have met in a geometry class.

Abraham was ousted from Austria for being a Jew. In turn, he joined the long tradition of brilliant Jewish scientists lending their abilities to the Allied war effort.

He was assigned to the Statistical Research Group, a special military department at Columbia University.

Wald’s job was to solve complex, consequential problems that stumped the Army’s brightest members. 

In many ways, he was the military’s Tier-1000 help desk. As a foreign national, he wasn’t supposed to have access to sensitive data, but they let him in simply because of how valuable he was. They knew Wald had a personal stake in this fight and the cognitive means of acting on it.

Leaders presented Wald with a very specific challenge: “How can we armor B-17s so they survive more missions?”

 

Wald found a path to a solution

The central constraint was weight: they couldn’t just double the armor on each plane. A flying tank was a fantasy. Planes still needed to carry bombs, have range, and stay maneuverable.

First, Wald asked for the repair data on every B-17 that returned from a mission. Most came back riddled with bullet holes, and crews documented the damage before repairs began.

The reports piled up on Wald’s desk. He had his team map all the damage onto a single consistent grid for proper referencing, so every report could be compared apples to apples.
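
In code, that normalization step might look like the minimal sketch below (the report format, plane IDs, and coordinates are hypothetical, invented purely for illustration):

    # A minimal sketch of gridding damage reports (hypothetical data format;
    # coordinates and plane IDs are invented for illustration).
    from collections import Counter

    reports = [
        {"plane": "B-17 #12", "hits": [(3.2, 1.1), (7.8, 2.5), (3.4, 1.3)]},
        {"plane": "B-17 #45", "hits": [(3.5, 1.4), (12.0, 0.8)]},
    ]

    CELL = 1.0  # grid cell size, in the same units as the hit coordinates
    grid = Counter()
    for report in reports:
        for x, y in report["hits"]:
            grid[(int(x // CELL), int(y // CELL))] += 1  # bucket each hit

    # Every plane's damage now lives on one shared grid, apples to apples.
    for cell, count in sorted(grid.items()):
        print(cell, count)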

The military engineers looked at the charts and argued about where to place the armor: on the outer or inner wings. 

Looking at the grids, the solution seemed obvious: add armor where the planes were taking the most damage.

Wald felt a tingle of doubt as he scanned the data, and he pushed back on that plan. His intuition sensed a flaw in the statistics.

Wald excelled at abstract thinking and began asking fundamental questions about the data. Eventually he arrived at the breakthrough question: “Where are the missing holes?”

The Allies were using damage reports only from planes that returned, and the planes that did return showed consistent areas where they hadn’t taken damage.

He concluded there were no holes in those spots because the planes hit there hadn’t made it back. The holes the data did show marked non-critical damage. Put more plainly, the military should armor the returning planes where they took the least damage.

The spots considered critical:

  • The underside of the cockpit, because a hit there killed the pilots.
  • The fuel tank, for obvious reasons.
  • The middle of the wings, where structural stress was highest.

By armoring those areas, survival rates increased by an order of magnitude. The newly protected spots deflected shrapnel and bullets, and more men returned home to their mothers.
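
You can see the statistics of the “missing holes” in a toy simulation like the one below (every section name, hit count, and lethality number is invented for illustration; this is a sketch of the bias, not Wald’s actual method):

    # Toy simulation of survivorship bias in damage data.
    # All sections and lethality numbers below are invented.
    import random
    from collections import Counter

    random.seed(42)

    SECTIONS = ["fuselage", "outer wings", "engines", "cockpit", "fuel tank"]
    # Chance that a single hit to this section downs the plane.
    LETHALITY = {"fuselage": 0.05, "outer wings": 0.05,
                 "engines": 0.40, "cockpit": 0.40, "fuel tank": 0.50}

    all_hits, returned_hits = Counter(), Counter()
    for _ in range(10_000):  # 10,000 sorties
        hits = [random.choice(SECTIONS) for _ in range(random.randint(1, 8))]
        survived = all(random.random() > LETHALITY[s] for s in hits)
        all_hits.update(hits)           # what actually happened
        if survived:
            returned_hits.update(hits)  # what the repair crews saw

    print(f"{'section':<12} {'all hits':>9} {'returned':>9}")
    for s in SECTIONS:
        print(f"{s:<12} {all_hits[s]:>9} {returned_hits[s]:>9}")

Hits land evenly across all planes, yet the returned-only column shows disproportionately few hits on the engines, cockpit, and fuel tank. Those missing holes are exactly where the lost planes were hit.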

And it happened because the military hired a bright academic who knew how to ask the right questions.

 

Don’t fall into this trap when attempting to solve problems

Dr. Wald’s insight revolutionized military operations research and how military leaders studied field data. This case study is also a perfect example of survivorship bias: our inclination to draw conclusions from the successes that survive to be measured while excluding the failures that don’t.

Survivorship bias is at work when we treat successful startup founders like holy figures dispensing unquestionable wisdom, while never examining why the other thousand startups failed.

Another relevant example happened in WW1. New steel helmets were issued, and recorded head injuries went up. Leaders erroneously thought the helmets were bad when it was quite the opposite: more soldiers were simply surviving the hits to their helmets that would previously have killed them.

There is a key takeaway we can learn from Wald: with any data-driven decision, examine your assumptions. Data is always limited by the way we view it.

The problems we get stuck on often have deceptively easy solutions that we fail to see. Strip your problem down to its most basic elements.

Then identify its mechanics and begin assembling possible solutions. Very often, you’ll laugh in frustration at how easy the answer really was.

The brightest minds excel at breaking a big problem down into lots of small problems.

