Monday, November 25, 2013

Do Pilots Rely On Automation Enough?

By now everyone in the commercial aviation world knows about the Boeing 747 Dreamlifter landing at the wrong airport in Kansas.  This aircraft is modified to fly large sections of the 787 Dreamliner to facilities where portions of the manufacturing process are performed.  The modified aircraft does, however, retain the standard FMS flight guidance systems of a typical B747-400.  There were no injuries to the cargo or crew outside of embarrassment.  Embarrassment is that feeling we get when we have remorse for our actions.  The more public the mistake, the greater the embarrassment.  Usually there is the obligatory “I’ll never do that again.”

It is ironic that in the midst of an industry-wide discussion about the deterioration of commercial pilots’ manual flying skills, we have an incident that could have been prevented by less manual flying.  It is very hard to fly an instrument approach to the wrong airport.  Landing on the wrong runway, maybe, but not at the wrong airport.  There were many clues inside the cockpit that could or would have shown the pilots they were not where they thought they were.  Sadly, many pilots crashed in the early days of instrument flying because they did not use or believe the information that was available to them.

This is the lesson that we should all learn from this crew’s embarrassment, “We are all capable of landing at the wrong airport.”  It is not because we are bad pilots.  It is not because we are too reliant on automation.  It is not because we do not have enough information to verify our position.  It is not because the ground proximity warning system is inadequate.  It is not because we have insufficient procedural guidance.  It is because we lack identification with those pilots who have made these mistakes before.  We think we are better than that.  The biggest threat we all face in aviation is the feeling that it won’t happen to us.

2 comments:

  1. Embarrassment is a small price to pay, as in this case, but not so much when lives are lost.

    The lesson we should learn from this is that the inability to learn from others’ mistakes (read hubris) requires luck to keep us out of harm’s way. Who wants to rely on luck to be the saving force in any risky endeavor?

    Hubris leads to disregarding information. It leads to too much reliance on automation. It leads to disregarding warning systems. It leads to procedural non-compliance. It leads to bad outcomes…if only embarrassment.

    Every one of us has made this mistake; we were just lucky enough to avoid the bad outcome.

    Embarrassing, isn’t it?
