How Cockpit Computers Prejudice Pilots’ Performance

Mental processes still important in autopilot age.
[Image: Cockpit showing autopilot.]
Peter Gwynne, Contributor

(Inside Science) – The increasing amount of automation in airliners’ cockpits has simplified the job of piloting. Automatic systems are particularly desirable during long cruise phases at steady speeds and altitudes.

But they may have a dark side. A new study indicates that pilots who rely too much on cockpit automation can lose the critical thinking skills that make them able to adapt to unexpected situations.

“Automation has created new opportunities for mistakes to be made, by pilots who don’t understand what the machine is doing and are not necessarily paying attention,” said Stephen Casner, a research psychologist at NASA’s Ames Research Center in Moffett Field, California, who headed the project.

The study differentiates between the manual skills that pilots use to operate airplanes’ controls and cognitive abilities that they need for such tasks as troubleshooting and maintaining awareness of their planes’ position.

“Hand-eye skills are implicit skills that we learn by simply doing them over and over – the no-brainers like tying shoes or riding a bike; once you learn them well they tend to stick around,” Casner said. “Cognitive skill requires you to stop and think about the situation and your prior experiences; it’s more cerebral.”

Casner’s team reports its conclusion in the journal Human Factors.

To explore the difficulty, Casner’s team recruited 16 airline pilots to fly routine and non-routine flights in a Boeing 747-400 simulator. During their “flights,” testers seated behind them varied the level of automation in the cockpit, introduced failures into the aircraft systems, and graded their performance. Every two minutes the testers also asked the pilots what they were thinking about; the pilots reported whether or not their thoughts involved the task at hand.

Changes in the level of automation included turning off the autopilot, which forced the pilots to fly by hand while following the flight director’s computed guidance; turning off the flight director, leaving the pilots to determine their course from raw instrument readings; and shutting off everything.

That last action required flight crews to keep in their heads the picture of where they were, where they were going, and where they should be going.

“You have to build that picture, rely on it, and update it,” Casner said. “That’s flat-out intellectual work. That’s where we saw more significant problems that worried us and need to be addressed.”

“While pilots’ instrument scanning and aircraft control skills are reasonably well retained when automation is used, the retention of cognitive skills needed for manual flying may depend on the degree to which pilots remain actively engaged in supervising the automation,” the team states in its paper.

“When things go to plan, pilots get invited into the lull – thinking about other things, staring at a piece of automation that’s doing just fine,” Casner explained. “So what happens when a situation arises that the automation cannot deal with? Computers may have some sort of plan that the pilot may not fully understand. The more capable the automation gets, the worse the problem gets, as the pilot is pushed farther and farther away from it.”

“The study and conclusion make sense,” said John Hansman, director of the MIT International Center for Air Transportation.

Modern airliner cockpits bristle with automation.

“We have support in planning the flight, an autopilot system to follow the route, systems that warn us about any kind of component failures, traffic, terrain, and support for pretty much everything there is,” Casner said.

In theory, an airplane can fly itself from one airport to another. But because of the volume of flights around typical airports, the original flight plan must change as the plane nears its destination. That’s when pilots need to understand what the automated system is doing.

“We have this odd paradox that this system, designed to make it easier for us, bizarrely makes it harder, as we program and reprogram the computer,” Casner said.

Pilots’ reactions to the failures that the testers introduced during the study showed that pattern.

“We really just did pretty simple failures, of instruments such as the airspeed indicator, altimeter, and heading indicator. We wanted to look at their process – how they reacted to something that didn’t look right,” Casner said. “Pilots had pretty good awareness of problems, but their [troubleshooting] process was not so good. We’re not really teaching how to plan for investigating and troubleshooting.”

Hansman applauded the study. “Its quality is good for this type of operational human-in-the-loop experiment,” he said. “It is very expensive and difficult to run high-fidelity simulator studies, so it is common to have to use a relatively small study pool. The use of 16 pilots in this group is reasonable.”

To overcome the erosion of cognitive skills, Casner’s team recommends additional practice, during actual flights or on simulators. Another possibility is teaching active monitoring of cockpit automation during long flights. Indeed, some airlines now encourage their pilots to turn off the autopilot and other automation systems occasionally.

“To keep these reasoning, thinking, and troubleshooting skills, we have to practice them,” Casner said. “It’s use it or lose it.”

Meanwhile, his team is extending its studies from air to land. “We’re now looking into automobile driving as cars get more automated,” he said.


Peter Gwynne is a freelance writer and editor based in Hyannis, Massachusetts, who covers science, technology and medicine.