Normalization of Deviance Considered Harmful


While reading Bruce Schneier’s CRYPTO-GRAM, I came across a section on normalization of deviance, which referenced a horrific example of how much death and destruction can result when people who think they’re too smart to follow rules and procedures designed by domain experts are allowed to remain in charge.

In 2014, a Gulfstream IV crashed while attempting to take off from the 7,000-foot runway (with another 1,000 feet of overrun) at Laurence G. Hanscom Field (BED) in Bedford, Massachusetts, killing all seven people aboard, because the two pilots were dangerous idiots.

The pilot in command had 11,250 hours in his flight log, and the second in command had over 18,000 hours in his; both apparently thought this meant they didn’t have to do a walk-around or complete even one item on the pre-flight checklist (which has five mandatory checks). The only pre-flight work they did inside the FBO before taking off was to order a pizza, carry it onto the plane, eat it, and return the empty box to the FBO. Then they:

  1. Forgot to disengage the gust lock, which meant none of the flight controls could move.
  2. Strong-armed the throttles past the gust lock, probably breaking part of it, and were thus unable to advance the levers to takeoff power (but they started rolling anyway).
  3. When they first noticed a problem, they had 5,000 feet of runway left—plenty of space to safely abort.
  4. Then they disengaged the flight control’s hydraulic system.
  5. When they had only 1,500 feet of runway left, they tried to deploy the spoilers, which would have significantly improved braking, but alas, the hydraulics needed to push them up into the airstream were still off.
  6. So the plane crashed through the runway lights, slammed into a ditch, broke in two, and burst into a fire that was not survivable.

All the follow-up interviews with other pilots who had worked with the pair confirmed that this failure to run the checklist wasn’t a one-time or occasional goof: they never performed a pre-flight checklist. The above article states that “Gulfstream pilot and Code7700 author James Albright calls the crash involuntary manslaughter.” Albright says “These pilots were experts at deceiving others…”.

This is a humongous problem, not just in aviation but everywhere in our society, and we continue to ignore it at our peril. It doesn’t matter how long someone has held a position they’re dangerously incompetent at (even if they have passed all of the existing qualification tests); they need to be fired and replaced with someone who isn’t afraid of good rules and of learning.

Yet this is a very difficult thing to sort out, because scattered among all of the good, sane rules that people should follow are bad, evil rules that competent people know they must break in order to do good things. How to tell the good rule breakers from the bad ones is a topic for another day, but the first step in getting there is to immerse yourself in as many worst-case scenarios like this one as you can.

The reason for the immersion is simple: you need to be able to spot such frauds from their behavior, the language they use, and their attitudes.

Author: Peter Sheerin

Peter Sheerin is best known for the decade he spent as the Technical Editor of CADENCE magazine, where he was the acknowledged expert in Computer-Aided Design hardware and software. He has a long-standing passion for improving usability of software, hardware, and everyday objects that is always interwoven in his articles. Peter is available for freelance technical writing and product reviews, and is exploring career opportunities in interaction design. His pet personal project is exploring the best ways to harmonize visual, tactile, and audible symbols for improving the effectiveness of alerting systems.
