This blog has focused largely on the technical side of aviation. One of the biggest drivers in civil aviation is passenger safety, and the last 40 years have brought tremendous advances on this front, with aviation now being the safest mode of transport. A lot of this has to do with the deep understanding engineers have of the strength of materials (static failure, fatigue and stability), the complexities of airflow (e.g. stall), aeroelastic interaction (e.g. flutter and divergence) and the control of aircraft. Furthermore, appropriate systems have been put in place to deal with uncertainty and to monitor the structural health of aircraft.

Anyone who has been inside a commercial aircraft cockpit can appreciate the technology that goes into controlling a jumbo jet. The number of switches, levers and lights is mind-boggling. A big part of the high tech in commercial aircraft is the set of automated control systems that keep the aircraft in the air and handle the parts of flight that require little input from the pilots (e.g. cruising at altitude). One could argue that human beings are fallible systems and that we should therefore relinquish as much control as possible to automated computer systems: get the computer to do everything it can and only allow humans to intervene in situations that require human judgement. In short: if it’s technically possible, let’s automate.

Complexity in the cockpit

The problem with this argument is that automating a process does not completely remove humans from the picture. If any form of human interaction is required at some point, the pilot still needs to be vigilant at all times in order to act swiftly when needed. Focusing only on automation and forgetting about the human-system interaction is bound to get us into trouble. This is a great risk of modern-day specialisation: focusing solely on your own niche of the problem and ignoring factors from other scientific disciplines – “For a man with a hammer, everything looks like a nail”.

So, we require more than a hammer in our toolbox. Until we have automated the whole flight envelope to statistical perfection, we need to think about the way that systems and humans interact in the cockpit. Guaranteeing the infallibility of the technical side is not enough. In fact, the aerospace industry was one of the first to introduce checklists into cockpits; these guide pilots through specific manoeuvres and prevent avoidable mistakes by capturing procedures that are easily overlooked or forgotten under pressure. It is incredible how successful you can be by continuously trying not to be stupid. The checklist system has worked so well that it is now being used in hospitals with amazing results. In the same way, the interaction between humans and machines has a lot to do with human psychology. As engineers we are generally aware of ergonomic design as a means of creating functional and user-friendly products, yet I have yet to see a university course that teaches the psychology of automation, or human misjudgement in general, to engineering students.

However, it is not hard to imagine what automation can do to our brains. For anybody who uses cruise control in their car: are you more or less likely to remain vigilant once the cruise control is set and you’ve taken your foot off the accelerator? I think it’s fair to say that most people will lose focus on what’s happening on the road once they are less engaged. The risk of automation is that it can lead to boredom and loss of attention to detail. This is especially dangerous if we have been lulled into a false sense of comfort and start relinquishing all control in the belief that the system will take care of everything.

Now why am I bringing this up? Because for exactly these reasons Colgan Air Flight 3407 entered an aerodynamic stall and crashed in 2009, killing everyone on board. According to the National Transportation Safety Board, the probable causes of the accident were “(1) the flight crew’s failure to monitor airspeed in relation to the rising position of the low-speed cue, (2) the flight crew’s failure to adhere to sterile cockpit procedures, (3) the captain’s failure to effectively manage the flight, and (4) Colgan Air’s inadequate procedures for airspeed selection and management during approaches in icing conditions. [1]” Apart from the fourth point, everything suggests a simple failure to pay attention. The captain had not noticed that the airplane was losing airspeed during the automated descent. When the stick shaker, a stall-warning system, alerted him, he pulled the control column in the wrong direction, further reducing airspeed and putting the plane into a stall from which it could not recover. In fact, a 1994 National Transportation Safety Board review of thirty-seven accidents involving airline crews found that inadequate monitoring of the controls was a contributing factor in 84% of the cases.
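As a rough aerodynamic sketch of why a decaying airspeed is so dangerous (a textbook steady-flight relation in standard notation, not figures from the accident report): the lift generated by the wing must balance the aircraft’s weight, so there is a minimum speed below which even the maximum lift coefficient cannot keep the aircraft flying,

$$ L = \tfrac{1}{2}\,\rho\, V^{2}\, S\, C_L = n\,W \qquad\Rightarrow\qquad V_{\mathrm{stall}} = \sqrt{\frac{2\,n\,W}{\rho\, S\, C_{L,\mathrm{max}}}}, $$

where $\rho$ is the air density, $S$ the wing area, $W$ the weight, $n$ the load factor and $C_{L,\mathrm{max}}$ the maximum lift coefficient. Pulling back on the controls increases both the load factor and the angle of attack, and beyond the critical angle of attack the lift coefficient falls rather than rises; this is why the textbook stall recovery is to lower the nose and regain airspeed, not to pull up.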

There is a lot to learn from these failures, and given the excellent track record of the aviation industry these findings will undoubtedly lead to better procedures. But apart from better procedures, we also need to educate the engineers of tomorrow more holistically, to look past purely technical design and incorporate research from psychology. Research into how this is best achieved is ongoing, but for now there is something we can all take away: don’t automate something simply because we can, but because we should.

References

[1] National Transportation Safety Board, Aircraft Accident Report AAR-10/01 (2010). http://www.ntsb.gov/doclib/reports/2010/aar1001.pdf

 

One Response to Human Fallibility in Aviation

  1. Ben Mathew James says:

    A well-written post that peeks into the downsides of automation. You are spot on with each and every point. Automation does not mean we can rely on it completely; humans still have to be vigilant. At the end of the day it is the pilot who has to make those critical decisions. Pilots have to be ready for the moment rather than banking on the luxury that automation provides. Finally, as you said: don’t automate something simply because we can, but because we should.

