You may not know the name Abraham Wald, but he left behind a valuable lesson you can apply to problem solving, engineering, and many other parts of life. Wald worked for the Statistical Research Group (SRG) during World War II, part of a top-secret organization in the United States that applied elite mathematical talent to help the Allies win the war. Near Columbia University, mathematicians and computers (the human kind) worked on problems ranging from how to keep an enemy plane under fire longer to optimal bombing patterns.
One of Wald's ways of approaching a problem was to look beyond the data in front of him. He looked for things that weren't there, treating their absence as an additional data point. It is easy to critique things that are present but incorrect; it is much harder to notice things that are missing. But the results of this technique were profound, and they offer an object lesson we can still draw from today.
Are You Measuring the Wrong Samples?
A problem was posed to the SRG: too many planes weren't returning from missions. Warplanes are, of course, the product of engineering trade-offs. Given a certain engine, you can fly only so much weight, and you divide that weight among crew, equipment, armor, and ordnance. The more you add of one thing, the more you must take away from the others. The problem was that Allied planes were being shot down and needed more armor, but more armor meant fewer bombs. The Statistical Research Group's task was to figure out how best to protect the planes without unnecessarily sacrificing the other factors.
The Army Air Corps noticed that after a mission, bullet holes were not distributed uniformly across the aircraft. The fuselage took nearly two bullet holes per square foot on average, and the fuel system took almost as many. But the engines took just a bit more than one bullet hole per square foot. The Army wanted to know how much more armor to put on the parts of the plane that were taking the most bullets.
Wald had a different point of view. Instead of putting armor on the fuselage, Wald wanted to add armor to the engines, even though they appeared to be taking fewer hits. Why? Because the samples the Army measured came only from planes that returned. Wald surmised that the planes with many bullet holes in the engines were not coming home. The extra armor belonged not on the part of the plane that could survive a lot of bullets, but on the part that couldn't.
As a result, extra engine armor appeared on warplanes from that point forward. Once you understand the logic, Wald's insight seems obvious. But it runs counter to most people's common sense to protect the part of the plane taking the least damage.
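You can convince yourself of the logic with a quick simulation. The sketch below is not Wald's actual data or method; every number in it (the section layout, the hit counts, the loss probabilities) is invented for illustration. It deals hits uniformly across three sections of each plane, downs a plane more readily when the engine is hit, and then counts bullet holes only on the planes that return.

```c
#include <stdio.h>
#include <stdlib.h>

/* Survivorship-bias sketch. All figures here are made up for
 * illustration -- they are not Wald's actual numbers. */

#define PLANES   100000
#define MAX_HITS 10

enum { FUSELAGE, FUEL, ENGINE, SECTIONS };

static const char *section_name[SECTIONS] =
    { "fuselage", "fuel system", "engine" };

/* Hypothetical chance that one hit in this section downs the plane. */
static const double loss_prob[SECTIONS] = { 0.02, 0.05, 0.30 };

int main(void)
{
    long seen[SECTIONS] = { 0 };   /* holes counted on planes that returned */
    long returned = 0;

    srand(1);
    for (long p = 0; p < PLANES; p++) {
        int holes[SECTIONS] = { 0 };
        int down = 0;
        for (int h = 0; h < MAX_HITS && !down; h++) {
            int s = rand() % SECTIONS;            /* hits land uniformly */
            holes[s]++;
            if ((double)rand() / RAND_MAX < loss_prob[s])
                down = 1;                         /* this hit downed the plane */
        }
        if (!down) {
            returned++;
            for (int s = 0; s < SECTIONS; s++)
                seen[s] += holes[s];              /* only survivors get measured */
        }
    }

    printf("%ld of %d planes returned\n", returned, PLANES);
    for (int s = 0; s < SECTIONS; s++)
        printf("%-12s holes per returning plane: %.2f\n",
               section_name[s], (double)seen[s] / returned);
    return 0;
}
```

Run it and the engine count comes out far lower than the fuselage count, the same lopsided hole pattern the Army saw, even though the gunfire had no preference at all. The low number isn't a sign that engines are rarely hit; it's a sign that engine hits rarely come home to be counted.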
In science and engineering, you have to question your assumptions and the assumptions of others. Things that seem obvious are often wrong. Heavy objects don't fall faster than light ones (all other things being equal). The Earth isn't flat, despite casual observation from the ground. Continents really do move, just not on a time scale we notice.
Identify Your Assumptions
How often do assumptions bite us when working on hardware or software? All the time. The second-hardest bugs to fix are the ones where we've made a wrong assumption. For example, you feel sure an input pin is pulled up internally, but it turns out it isn't. Perhaps you are sure the compiler zeros variables for you, but it turns out it doesn't. The hardest false assumptions to spot, by the way, are the ones where our abstractions are broken. Finding when a compiler generates incorrect machine code, for example, is very hard, because we're looking at the problem from a different level, and everything up there looks just fine. The same happens when you get a black-market IC that behaves similarly to the real part but doesn't quite meet its specs.
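The compiler-zeroing assumption is easy to demonstrate in C, where only objects with static storage duration are guaranteed to start at zero; local variables hold indeterminate values, and reading one is undefined behavior. A minimal sketch (the variable names are invented for the example):

```c
#include <stdio.h>

static int boot_count;   /* static storage: guaranteed to start at zero  */

int main(void)
{
    int retries;         /* automatic storage: value is indeterminate    */
    int timeouts = 0;    /* make the "starts at zero" assumption explicit */

    /* printf("%d\n", retries); -- reading retries here is undefined
     * behavior; it may print 0 in a debug build and garbage elsewhere.
     * (Your compiler may warn that retries is unused -- good.) */
    printf("boot_count = %d, timeouts = %d\n", boot_count, timeouts);
    return 0;
}
```

The treacherous part is that an unoptimized build often happens to hand you freshly zeroed stack memory, so the wrong assumption appears to work right up until it doesn't.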
For software, hardware, and most other fields, it is much easier to look at what's present and critique it than it is to notice what's missing entirely. In Wald's case, everyone was looking at the hole densities without asking why the density wasn't uniform to begin with.
Abraham Wald’s Legacy
During his time with the SRG, Wald was not allowed to see the finished reports he contributed to. He was considered an enemy alien: the grandson of a rabbi and the son of a kosher baker, he had emigrated to the US from Austria after its annexation by Nazi Germany. Despite being unable to obtain a security clearance, he was content to apply his considerable mathematical expertise to help drive the Nazis out of Europe.
Ironically, Wald and his wife died in a plane crash on an Air India flight. They were survived by their son, Robert Wald, who went on to become an accomplished theoretical physicist.
There is some debate about the veracity of the story of Wald and the missing bullet holes. According to the American Mathematical Society, some parts of it are unverifiable. Wald did work for the SRG and is known for some fundamental techniques of operational research, and it is known that the team worked on aircraft vulnerability. Several people familiar with the work have credited Wald, so I like to think the story is mostly true.
Wald may not have invented this technique, but his application of it was classic, and a lesson from which we can all learn.