Wednesday, January 29, 2014

I Learned About Flying From That

I was a senior in college when I first thought seriously of making aviation a career.  Like many before me, the first thing I did was to buy a copy of FLYING magazine.  There were lots of advertisements for flying schools and stories about all different types of airplanes and their avionics.  However, the article I enjoyed the most was, and still is, “I Learned About Flying From That”.  I am not sure why I found it so interesting, but I couldn’t get enough of them.  Before I had even taken my first flying lesson, I had read many of the articles that shared the stories of pilots who told the reader what they had learned from their “experience”.

I guess it is no surprise that after earning a private pilot’s license, flying in the military and spending over 35 years as an airline pilot, I am still fascinated by pilots telling their stories.  I especially like hearing pilots’ self-critiques of what they thought worked out well and what did not.  I have used the information from those stories countless times in my career to keep me out of trouble or elevate my flying skill.  I think many other pilots, private and professional alike, find the narratives have tremendous value.

I recently wondered when the first installment of “I Learned About Flying From That” appeared in FLYING.  The magazine was originally entitled POPULAR AVIATION, and was first published in November 1927.  However, “I Learned About Flying From That – No. 1” first appeared in the May 1939 issue.  The publishers included a preface under the title of this first article: “This is the story of a pilot who had a harrowing experience that taught him a lesson.  It is our hope that other airmen will profit by his mistake.”  The first “I Learned About Flying From That” was a story about a rescue mission to Alaska in a Ford Tri-Motor.  The author, Garland Lincoln, relates the circumstances, including the weather and his decision-making process, that allowed the flight to end with the Tri-Motor upside down in the mucky tundra.  Garland writes, “The lesson I learned?  That, whether or not lives are at stake, taking chances is silly.”

The Aviation Safety Action Program (ASAP) was initiated and authorized by the FAA for the same purpose.  It was intended to be a venue “to enhance aviation safety through the prevention of accidents and incidents.  Its focus is to encourage the voluntary reporting of safety issues that come to the attention of employees and certain certificate holders.”

It is unfortunate that ASAP has not been more successful at sharing the actual experiences of airline pilots’ lessons learned in real-life situations.  Each one is another “I Learned About Flying From That”.  Each one includes a narrative of the situation that initiated the report, and then there is an opportunity for the pilot to self-evaluate.  How would this situation be handled differently if it were encountered again?  What advice would be given to others to better prepare them for a similar situation?  Professional pilots are always interested to hear advice from their peers that will keep them out of trouble.

Currently, the data is collected, analyzed and then distributed through standardized recommendations or via new or amended standard operating procedures.  I absolutely support the “just culture” that calls for immunity and anonymity unless there is reckless intent.  However, ASAP data is not just for safety managers.  Once these reports have been de-identified and any retraining has been accomplished, the raw information needs the widest distribution.  It needs to be seen by line pilots as soon as possible; they would treat these narratives as essential reading.  Pilot associations, too, should get behind the distribution of these reports, not just the protection of the anonymity of the data.

What has changed since 1939?  Is aviation now so sophisticated that we don’t need to hear from pilots how they made their mistakes?  Will there ever be a time when we don’t need to be admonished by our peers not to take chances?  Will there ever be a time when we don’t need the advice of our fellow crew members?

Saturday, January 25, 2014

Not To Decide Is To Decide

In the most recent edition of Air Transport World magazine, Robert W. Moorman writes an excellent article reviewing the advances in technology that have improved flight safety.  The article ends with quotations from Flight Safety Foundation (FSF) CEO Kevin Hiatt.  Hiatt gives a statistic that most airline safety managers are aware of, but have not been able to change.  He said, “What we’ve discovered is that 96% of the approaches in the system are flown correctly.  But in the 4% that are not, we’re finding that the pilot is continuing to fly the approach, rather than initiate a go-around.”  In fact, some airlines have been able to increase the percentage of stable approaches, but not the percentage of unstable approaches that result in a go-around.

This conundrum has no hardware solution.  The statistics are solely dependent on decision-making.  There is not a device that can prevent a pilot from making a poor or ineffective decision.  Only training and experience can improve effective decision-making statistics.  By experience, I mean the collective experience of all pilots.  As the FSF has been doing for decades, data must be collected and shared among professional airmen for the purpose of collective knowledge.  What is not being widely done, however, is using this body of data to help pilots learn decision-making skills.

Decision-making (DM) is not the same as standard operating procedure (SOP).  In fact, it is just the opposite.  Decision-making is, by definition, a choice.  SOP does not rely on choice, but on strict obedience.  Imagine a spectrum with DM at one extreme and SOP at the other.  That spectrum defines the environment that pilots live in.  Contemporary training and proficiency standards for commercial pilots are biased very heavily toward the SOP end of the spectrum.  Pilots are taught how to comply with SOP, but much less training is focused on how to remain within SOP or, more importantly, how to recover when a deviation from SOP has occurred.  It’s easy to label this area as intentional non-compliance, but that would be far too simplistic.

Pilot performance outside SOP is exactly the territory the FSF data describes.  When an unstable approach occurs, SOP is no longer controlling the outcome.  If it were, the approach would not be unstable, or a go-around would always be accomplished.  When outside SOP, decision-making will be the determining factor.  However, the data shows that pilots can be very ineffective when making these decisions.  The major approach and landing accidents of 2013 at SFO, LGA and BHM, as well as two recent occurrences of landing at the wrong airport, further support this position.

Decision-making is not simply a plot on a risk matrix.  It is a proactive and deliberative process that evaluates and matches choices to the existing or expected conditions.  The choice will most likely be dependent on the goal of the decision maker. If safety is perceived as the primary goal, landing will become subordinate.  Conversely, if landing is the goal, it will drive the choice selection.  In other words, “I am going to stay safe and land if it works out.” or “I am going to land and I think I can stay safe while I do that.”

Why have the hardware “safety enhancers” of airplanes been more successful than the pilots that fly them?  I believe it is because the developers of hardware devices accept failure as a possibility, whereas writers of SOP do not accept the reality of non-compliance, whether intentional or not.  No component is ever installed on an aircraft without a tested and trained process in the event of its failure.  What is the process for failure of SOP?  Are we to expect that all pilots will follow all SOP all of the time?  If not, then what is the process for human failure?

Airline pilots spend hours and hours in the classroom and simulators learning procedures for both normal and non-normal situations.  They are carefully evaluated on their knowledge of what to do in the event of system failures.  They practice and debrief realistic scenarios over and over to be prepared for extremely rare events.  How much time is spent learning how to manage human failures?  I bet it’s pretty close to 4%.

Wednesday, January 15, 2014

W. T. F. ?


“The landing was uneventful, and all customers and crew are safe,” airline spokesman Brad Hawkins said in a statement late Sunday.

“Airline spokeswoman Brandy King said Monday that the captain and first officer were removed from flying duties while the airline and federal aviation safety officials investigate the mistake.”

Of course Jay Leno had some funny things to say about the mistake, but there is a very serious side to the incident.  These two statements from the same news report of SWA 4013 totally contradict each other.  How can the flight be uneventful and the pilots taken off flying status for their mistake?  What was meant by spokesman Brad Hawkins’s statement that “all customers and crew are safe”?  I guess that was because they were no longer on the airplane.  Before the airplane landed, “the outcome of the maneuver” and the safety of the customers and crew were clearly in doubt.  Fortunately, luck overcame human failure.

Last November I posted an article, Do Pilots Rely On Automation Enough?, after the “Dreamlifter” 747 flown by Atlas pilots landed at the wrong airport in Kansas.  The antidote for this or any other type of pilot error is the acknowledgement and ownership of the statement “I AM CAPABLE OF MAKING MISTAKES”.  Pilots without that mindset, no matter how proficient or experienced, will continue to fall prey to their human vulnerabilities.

I am beginning to believe that the culture of the airline industry in general is actively trying to avoid this truth.  I am very reluctant to highlight specific operators.  However, in this case it is unavoidable.  I believe we are looking at the tip of an iceberg.  The unusual number of recent incidents at Southwest Airlines requires either an internal or external audit of their flight operation.  This audit is necessary to identify corporate attitudes and cultural attributes, both positive and negative, so that these lapses in safety can be addressed.

The audit needs to be a Line Operational Safety Audit (LOSA).  The goal of this audit should be an examination of the operation’s approach to safety, not just an audit of procedural compliance.  I am sure that Southwest, like all other airlines, has sufficient procedural guidance to enable its pilots to avoid their string of recent incidents.  So one must ask, how does this continue to happen?  And more importantly, how do we change the outcome?  The LOSA Collaborative, under the direction of Dr. James Klinect, provides airlines with the tools and training necessary to audit their operations and assess the collected data.  It also allows for normalization and distribution of data among member airlines.  All aviation professionals have a vested interest in the pursuit of the highest level of aviation safety.

Nearly 20 years ago, a Continental Airlines DC-9 landed gear up at Bush Intercontinental Airport (IAH).  Even though no one received even a minor injury, the airline’s management took a courageous approach to that profoundly avoidable accident.  Instead of looking at the accident as just a failure of the pilots, they looked at the operation as a whole.  Just like the 737 crew landing at the wrong airport, the DC-9 pilots had SOPs in place that would have prevented the gear-up landing.

The paradox was, and still is, “What is the procedure that ensures SOPs will be followed?”  How do crews escape the inevitability and consequences of human error?  Humans CAN and WILL make errors.  The only solution is to accept their existence.  The only antidote is mitigation.  As humans, we cannot successfully avoid or ignore errors.  They must be embraced and accepted as inescapable.  Unfortunately, pilots, like other very proficient and highly motivated individuals, are the least likely to accept their fallibility.

It took the leadership of Continental CEO Gordon Bethune, the commitment of flight standards and training, the guidance and research of Dr. Robert Helmreich and his team at the University of Texas, and the willingness of the line pilots to develop the safety management approach we know today as Threat and Error Management (TEM).   LOSA is an integral component of an effective TEM program.

I have asked this question many times, “Is the goal of airline operations safety or procedural compliance?”  Will procedural compliance guarantee safety?  Is having a published procedure a guarantee that pilot errors will be eliminated?  Is it just process or the achievement of an objective? 

I will agree that landing a 737-700 on a 3700’ runway is an impressive piece of airmanship.  However, like performing surgery on the wrong body part, doing the wrong thing well is still doing the wrong thing.  The last time I had surgery, the team in the operating room asked me a list of questions that ensured they were doing the right thing to the correct part on the intended individual.  Just requiring a list of questions, i.e. creating SOP, will not eliminate error.  To do that requires a mindset that includes the acknowledgement of error.  Not just error in general, but that each person involved might be the one to make the mistake.  SOP is not the ultimate goal; rather, it is a tool to manage error.  When that mindset exists, the individual and the organization are able to look at the threats, in themselves or their environment, that cause humans to make unintentional errors.

Will Southwest, or any other airline for that matter, have the courage to look internally and ask, “Do we train our pilots to manage inescapable error, or do we just write procedures, expecting an unachievable error-free execution of these standard operating procedures?”

Time will tell, but I am worried.

Tuesday, January 14, 2014

It’s A Sin To Kill A Mockingbird


Harper Lee’s classic novel, To Kill A Mockingbird, is a giant in American literature.  The author uses not only the main characters to tell a provocative story, but allows the minor ones to add even greater depth to the lessons the children, Jem, Scout and Dill, learn as their world grows beyond the innocence of their own backyard.  One such interaction takes place between Jem and their spinster neighbor Maudie.  Jem hears from Miss Maudie what he has probably felt for some time, that his father, Atticus, is different from other men in Maycomb County.

“I simply want to tell you that there are some men in this world who were born to do our unpleasant jobs for us. Your father’s one of them.”

“Oh,” said Jem. “Well.”

“Don’t you oh well me, sir,” Miss Maudie replied, recognizing Jem’s fatalistic noises, “you are not old enough to appreciate what I said.”

Jem was staring at his half-eaten cake. “It’s like bein’ a caterpillar in a cocoon, that’s what it is,” he said. “Like somethin’ asleep wrapped up in a warm place. I always thought Maycomb folks were the best folks in the world, least that’s what they seemed like.”

“We’re the safest folks in the world,” said Miss Maudie. “We’re so rarely called on to be Christians, but when we are, we’ve got men like Atticus to go for us.”

As I reach the sunset of my professional aviation career, this passage speaks so clearly to me about the role airline Captains play in the thousands of flights that navigate the globe every day.

Just like the Finch children, Jem and Scout, today’s airline pilots exist in a protected environment that rarely requires them to make the hard choices.  Decision-making is mostly predestined in thick manuals of policy and procedure.  In rare cases, situations arise that require extraordinary leadership.  In the spirit of Atticus, some of the men who have been required to do the tough things have names like Haynes, de Crespigny and Sullenberger.  However, many others do not have names because their tough choices are never noticed. 

The difference between notoriety, infamy and obscurity can be very slim. Sometimes disaster is avoided by landing a crippled jetliner.  However, sometimes it is avoided by making a tough decision.  Landing on a slippery runway, descending into mountainous terrain at night, flying through thunderstorms or even flying a daytime visual approach can also be opportunities to avoid disaster. When things get rushed, confused or doubtful it takes courage to resist momentum and break the unfolding chain of events.  So many iconic air crashes might never have happened.  So many lives could have been saved, but for making the hard choices.  To abandon an approach or divert, especially while others are landing, is not an easy decision.  Most likely it will be met with opposition or negative critique.  Captains who shoulder this responsibility every day, who make the tough choices, will always remain nameless. There are no accolades or news headlines for a safe arrival. The simple peace and confidence of doing the right things to be safe is its own reward.