Wednesday, June 10, 2015

Survivorship bias

Original post:  Jun 12, 2013

My sons enjoy learning about World War II. We've been watching a video series they brought home from the library that discusses many of the weapons from that era. They especially enjoy the combat airplanes. It was amazing to hear the tales of bombers surviving incredible damage and limping home.

Recently, I stumbled onto an article discussing how an effort to protect those bombers highlighted an interesting fallacy the author calls "survivorship bias". I found the article very thought-provoking.

The author defines survivorship bias as follows:

The Misconception: You should study the successful if you wish to become successful.
The Truth: When failure becomes invisible, the difference between failure and success may also become invisible.

In WWII, the Department of War enlisted mathematicians, among them the statistician Abraham Wald, to help the war effort. Since bombers were highly vulnerable, the military decided to improve the crews' odds of returning safely by strengthening the armor on the planes.

[Photos: battle-damaged B-17 bombers]

"The military looked at the bombers that had returned from enemy territory. They recorded where those planes had taken the most damage. Over and over again, they saw the bullet holes tended to accumulate along the wings, around the tail gunner, and down the center of the body. Wings. Body. Tail gunner. Considering this information, where would you put the extra armor? Naturally, the commanders wanted to put the thicker protection where they could clearly see the most damage, where the holes clustered. But Wald said no, that would be precisely the wrong decision. Putting the armor there wouldn’t improve their chances at all."

"Do you understand why it was a foolish idea? The mistake, which Wald saw instantly, was that the holes showed where the planes were strongest. The holes showed where a bomber could be shot and still survive the flight home, Wald explained. After all, here they were, holes and all. It was the planes that weren’t there that needed extra protection, and they had needed it in places that these planes had not. The holes in the surviving planes actually revealed the locations that needed the least additional armor. Look at where the survivors are unharmed, he said, and that’s where these bombers are most vulnerable; that’s where the planes that didn’t make it back were hit."

Some problems demand deeper thought than they appear to. The obvious answer may be exactly the wrong one!
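Wald's logic is easy to demonstrate with a tiny simulation. The sketch below is my own illustration, not anything from the article: the section names, hit counts, and lethality probabilities are invented assumptions, chosen so that engine hits are far deadlier than hits elsewhere. Hits land uniformly across the plane, but only the planes that make it home get their holes counted.

import random

# A minimal sketch of survivorship bias, with invented numbers.
# Hits land uniformly across four sections, but a hit to the engines
# is assumed to be far more likely to bring the plane down.
SECTIONS = ["engines", "wings", "fuselage", "tail"]
LETHALITY = {"engines": 0.6, "wings": 0.05, "fuselage": 0.05, "tail": 0.05}

def fly_mission(num_hits=5):
    """Simulate one plane taking num_hits hits; return (survived, hits)."""
    hits = [random.choice(SECTIONS) for _ in range(num_hits)]
    survived = all(random.random() > LETHALITY[section] for section in hits)
    return survived, hits

random.seed(42)
survivor_holes = {s: 0 for s in SECTIONS}
fleet_holes = {s: 0 for s in SECTIONS}

for _ in range(10000):
    survived, hits = fly_mission()
    for section in hits:
        fleet_holes[section] += 1          # every hit, shot down or not
        if survived:
            survivor_holes[section] += 1   # only what the ground crew sees

print("Holes on returning planes:", survivor_holes)
print("Holes across whole fleet: ", fleet_holes)

Across the whole fleet the holes are spread evenly, but the returning planes show almost no engine damage. Exactly as Wald argued, the "clean" section on the survivors is the deadliest place to be hit.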

The author continues:

"Simply put, survivorship bias is your tendency to focus on survivors instead of whatever you would call a non-survivor depending on the situation. Sometimes that means you tend to focus on the living instead of the dead, or on winners instead of losers, or on successes instead of failures. In Wald’s problem, the military focused on the planes that made it home and almost made a terrible decision because they ignored the ones that got shot down."

As the psychologist Daniel Kahneman writes in his book Thinking, Fast and Slow, “A stupid decision that works out well becomes a brilliant decision in hindsight.” The things a great company like Microsoft or Google or Apple did right are like the planes with bullet holes in the wings. The companies that burned all the way to the ground after taking massive damage fade from memory. Before you emulate the history of a famous company, Kahneman says, you should imagine going back in time to when that company was just getting by and ask yourself if the outcomes of its decisions were in any way predictable. If not, you are probably seeing patterns in hindsight where there was only chaos in the moment. He sums it up like so, “If you group successes together and look for what makes them similar, the only real answer will be luck.”

Here is the link to the full article:  http://youarenotsosmart.com/2013/05/23/survivorship-bias/
