Aviation Safety and Confirmation Bias
Confirmation bias is one of the most insidious cognitive biases we face. It pervades our social media and our polarized style of debate. But when it crawls into the cockpit, it needs to be made unwelcome, and quickly. When I first came across Fred George’s piece, “Cognitive Biases Can Cause Real Trouble,” in March of 2016, I filed it away as great future fodder for the safety series.
As pilots, we have opinions, beliefs, and more than a few of us cling to a mental model of the way things should be.
When not flying, such determination can be a good source of debate. When flying, however, one particular cognitive bias – confirmation bias – is a fairly insidious critter we should watch for.
“How finite is human memory? While the human brain is capable of 10 quadrillion processes per second, far more than any computer yet designed, our accessible memory actually is far less capable.” —Aviation Week and Space Technology, March 2016
Looking at the quote above, you might imagine how we’d build a toolset to survive. Since we can’t retain, access and remember (everything), it behooves us to develop methods to protect ourselves from cognitive biases.
To be fair… I’m no Tony Robbins. And doing the research for this article revealed something obvious:
When you work in private, commercial and business aviation, you have a lot of contact with leaders. That contact revealed that flying airplanes and running corporations can have similarities:
Our blindness as pilots, commanders and CEOs can lead to the same type of critical problems.
Shem Malmquist, a senior MD-11 captain for a major U.S. freight carrier (and a Fellow of the Royal Aeronautical Society), identified the cognitive biases most likely to affect safety of flight. Of the 12 he identified, four are obvious to those of us aviators who have fallen victim to things such as “get-home-itis” and similar entrenchments.
These four are worth mentioning, since their presence is found at the scene of most accidents.
As a pilot or CEO, consider the power of that statement. Making a bad call hurts people.
Four biases well known to aviators are:
Attentional Tunneling: This occurs when the crew focuses excessive attention on one item to the detriment of overall situational awareness. Think Eastern Air Lines Flight 401: the entire crew obsessed over a burned-out landing gear indicator light while the aircraft descended into the Everglades.
Plan Continuation Bias: Pilots reading this will know this as a version of “get home-itis.” It is also framed by Malmquist as “crews also may be biased toward continuing missions because of external factors, notably their passengers’ expectations.”
Anchoring Bias (a.k.a. focalism): This cognitive bias causes pilots to rely excessively on the first piece of information provided to them, which then forms an anchor for making decisions. The anchor is so powerful that there is a reluctance to deviate from it to assure adequate safety margins. You could argue that anchoring is the gateway drug to confirmation bias.
Confirmation Bias: The sneakiest of the top four might be confirmation bias. This one is a bit more subtle and dark, since it tinkers with the operating system of how we see the world across multiple sources of input. Our brain, for whatever reason, chooses to listen to some of the evidence, weight it selectively, and make horrible decisions accordingly.
In the spirit of Dale Carnegie, we’ll focus on my lucky youth as a pilot and general shortcomings as a person. As a flawed human being, lucky pilot and student of our mind, it is my duty to learn from surviving my own wanderings into dangerous terrain.
Flaws, shortcomings, or just outright blunders bring people together in a non-judgmental framework. And therein lies the key – the “non” before the word judgmental. Nobody likes to be judged. And if you are trying to impart wisdom, you might rethink the wisdom of leading with criticism. Even your dog can corroborate this universal truth.
Much of what I’d like to impart is also fun because I still maintain an airline transport pilot license. I still fly professionally and in this arena, I’m fortunate to interact, observe and learn.
Surviving Confirmation Bias
Even a seemingly well-put-together captain can fall prey to confirmation bias. If such a captain isn’t getting their exercise, doing their yoga, eating right, or just generally working on self-awareness, the door is open to problems. Health has an impact on mental clarity, and mental clarity has an impact on safety.
I used to keep a Cessna 185 at a farm in Ontario, just over the border from Massena, NY. One of the pitfalls of keeping an aircraft outside is that bugs and birds are always looking for better accommodations. Nothing beats the warmth of an engine cowling for cover, nesting and breeding.
I have since survived the “WTF is that smell?” ordeal.
(It was a bird’s nest igniting inside the engine compartment.)
A wiser, calmer, more centered person would have conducted a more thorough preflight inspection of the aircraft. Better yet, they’d get a set of cowl plugs and wouldn’t rush a preflight. This is largely why the perpetually rushed – entrepreneurial types, doctors, lawyers – shouldn’t fly.
So… post bug residency, I fired up the 185 and took off one day… not knowing that insects, insect larvae and a nest could actually kill an airplane.
My airspeed seemed a bit higher than normal after takeoff, though I had a normal climb attitude (the angle at which the nose is lifted above the horizon). There were trees to clear at the farm, so doing this right was important, yet somehow today seemed like a gift: the extra speed meant I could pull back a bit more and trade it for more distance between me and the trees.
Note to reader: If you actually do have more speed, this is a very normal, good, and sane thing to do.
So I did.
As I pulled back to climb more aggressively, I noticed something was missing. When you are going 70 or 100 mph, you can typically *hear* the wind on the airframe – the “slipstream” associated with going faster. It wasn’t there. I was experiencing an inner conflict: 1 + 1 did not equal 2, and that is typically a terrible feeling as a pilot.
Yet the airspeed indicator confidently showed 90 kts – plenty of room to pull back more – and yet I wasn’t feeling good about any of this.
Before I could process the creeping feeling of panic, the stall horn began to sound. In the 185, the stall horn is akin to a bottle whistle. If you’ve ever made an empty bottle whistle by gently blowing across the top of it, you know the simplicity of the Cessna 185’s stall horn. Short version: you have to be going really slow for that to sound. This is a scary feeling close to the ground.
An aerodynamic stall is quite dangerous, since it means the wing is going from “flying” to “not flying.”
This is also known as falling like a brick since your wings are on vacation.
Reflexively, I pushed the nose down, and after my stomach had finished its excursion into my throat, I realized that we were “flying” again. The trees were close – not slapping the big bush tires of the 185, but not far away either.
I had been deceived. I grabbed a Post-it note and covered the airspeed indicator. I would no longer suffer being lied to.
I had learned something in seconds:
Everything else disagreed with the airspeed indicator.
By a jury of its peers, with me as judge, I covered it and elected to go back and land using RPM (engine output) and pitch attitude as general guides to whether I was flying “ok” without any real airspeed information.
Confirmation bias is interpreting new evidence as confirmation of your beliefs or theories. I had plenty of reasons to believe the highly trusted airspeed indicator.
I had taken off, I had plenty of power, I had a reliable airplane that had never let me down. A classic setup.
Getting Bitten by Confirmation Bias
Once Confirmation Bias sets in, an aircraft and its crew may begin wandering towards an accident. The trick is to have a crew, tools and default behaviors that make it possible to get off any path confirmation bias leads you down.
If you don’t know the airspeed gauge is lying, and if you make decisions accordingly, you wander deep into danger – quickly.
A less lucky example of a blocked airspeed system with disastrous results was Northwest Airlines Flight 6231.
As the aircraft climbed the crew began to paint a mental picture of perhaps why the airplane was going so fast (hint: it wasn’t).
“It’s because we are so light.” (They were repositioning empty to pick up the Baltimore Colts and didn’t often get to fly this big powerful airplane empty.)
As the aircraft climbed, the indicated speed increased. (The airspeed indicator had effectively become an altimeter. With the pitot system blocked, the pressure trapped inside it stayed fixed while the outside static pressure fell as the aircraft climbed, so the indicated airspeed rose – not with speed, but with altitude. There was no longer a valid speed reference.)
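To see how a blocked pitot line turns the airspeed indicator into an altimeter, here is a minimal sketch in Python. The numbers are illustrative, and it assumes the standard-atmosphere pressure model and the simple incompressible airspeed calibration built into mechanical gauges; none of this comes from the accident report.

```python
import math

RHO0 = 1.225      # sea-level standard air density, kg/m^3
P0 = 101_325.0    # sea-level standard pressure, Pa

def static_pressure(alt_m):
    """ISA troposphere static pressure (Pa) at a given altitude (m)."""
    return P0 * (1 - 2.25577e-5 * alt_m) ** 5.25588

def indicated_airspeed(p_pitot, p_static):
    """Indicated airspeed (m/s) from the pitot/static differential,
    using the fixed sea-level density the gauge is calibrated to."""
    q = max(p_pitot - p_static, 0.0)   # dynamic pressure the gauge "sees"
    return math.sqrt(2 * q / RHO0)

# A fully blocked pitot line traps whatever pressure was present when the
# blockage formed -- say, level flight at 1000 m and a true 80 m/s.
trapped = static_pressure(1000) + 0.5 * RHO0 * 80**2

# Climb with the trapped pitot pressure: static pressure falls outside,
# so the differential -- and the indicated airspeed -- grows with altitude.
for alt in (1000, 3000, 5000, 7000):
    ias = indicated_airspeed(trapped, static_pressure(alt))
    print(f"{alt:5d} m -> indicated {ias:5.1f} m/s")
```

The trapped pitot pressure never changes, but the static pressure outside falls as the airplane climbs, so the indicated airspeed climbs too – exactly the trap this crew fell into.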
The crew began to form a more solid mental picture with the captain’s lead.
Below is a transcript from the cockpit voice recorder – where the first officer is flying and the captain is monitoring:
Captain: “Would you believe that number.”
[Referencing the airspeed.]
First Officer: “I believe it, I just can’t do anything about it.”
[He had a mental picture of what type of pitch attitude he expected.]
Captain: “No, just pull her back, let her climb.”
[The Captain overrode this instinct and thought – heck, why not trade this {non-existent} airspeed for altitude.]
Most experienced paper airplane flyers know what happens next. Pitch up for too long, with inadequate power, and you will experience an aerodynamic stall. That’s when your paper airplane’s wing drops and it recovers by flying nose down and accelerating. Enough of that downhill stuff and it’s flying again. (This is the noticeable break your paper airplane makes after its first “pause” in the air. That pause *is* the moment of stall: it ran out of energy – airspeed, in a nose-up profile – after you released it from your hand launch.)
This crew couldn’t recover since they were likely dedicated to believing the erroneous speed reading. They impacted the ground 83 seconds later from 23,000 feet.
Stall recovery in a jet is a tough affair, hence the heavy training emphasis on never getting near one.
Learning
The best a fallible human can hope for is to apply more mindfulness the next time. The moment you face a personality that knows it all and assures you they have made all the mistakes there are to make (and dispenses orders and wisdom accordingly), take note. The leader you want is confident, sure – but not so confident that they stop continually reevaluating the information they are working with, where it comes from, and why it might be saying what it’s saying.
##
Image thanks to jamesclear.com.