As Attention Wanders, Rethinking the Autopilot
By CHRISTINE NEGRONI
Published: May 17, 2010
The incident is one of more than a dozen cited in an airline industry report in which pilots failed to properly monitor the flight, the automation or even the location of their airplane. The report, which came out in 2008, is getting new attention in light of the most conspicuous recent example of pilots not paying attention: the Northwest Airlines flight that overshot its destination and traveled on for another 150 miles before turning back to the airport last October.
Whether these incidents are symptoms of a larger problem in the cockpit is the subject of a debate among aviation experts: are airliners so automated that pilots are becoming complacent?
“We at the N.T.S.B. are continuing to see accidents and incidents like the Northwest 188 event,” said a safety board member, Robert L. Sumwalt. “Here it shows we have proof that pilots are still not adequately monitoring the aircraft flight path.”
The problem of pilots losing track of some aspect of a flight is not new. It has been around, in fact, almost as long as the technology that allows computers to perform some pilot functions. In 2002, Mr. Sumwalt was one of three authors of a paper that found pilots failed to adequately monitor what the airplane was doing in one-half to three-quarters of the mishaps reviewed. That study was undertaken after several incidents, including one in 1996 in which the pilots of a Continental Airlines plane failed to make sure the landing gear had been properly lowered before landing.
“Humans are not good monitors over highly automated systems for extended periods of time,” said Mr. Sumwalt, who was a pilot for 24 years at Piedmont and then US Airways. “We want to acknowledge that you can’t expect someone to be extremely vigilant for five or seven or three hours.”
Not all experts agree. The Federal Aviation Administration said that neither the Northwest flight nor other incidents examined by the airline industry indicated a bigger problem.
J. Randolph Babbitt, the F.A.A. administrator and a former airline pilot, said he would be concerned if pilots, confused by the complex automation, were losing track of what the airplane was doing.
But the Northwest pilots were on their laptops, Mr. Babbitt said, doing work unrelated to the flight, a prohibited activity. “It doesn’t have anything to do with automation,” he said. “Any opportunity for distraction doesn’t have any business in the cockpit. Your focus should be on flying the airplane.”
Automation is generally considered a positive development in aviation safety because it reduces pilot workload and eliminates errors in calculation and navigation. “The introduction of automation did good things,” said Key Dismukes, chief scientist for aerospace human factors at NASA. But it changed the essential nature of the pilot’s role in the cockpit. “Now the pilot is a manager, which is good, and a monitor, which is not so good.”
Hugh Schoelzel, the vice president of safety at Trans World Airlines — a carrier acquired by American Airlines in 2001 — said most pilots had at one time or another lost track of where they were in flight. “Anyone who says they haven’t is either being disingenuous, or hasn’t been paying attention,” he said.
The episode in which the returning captain had to act quickly to save the airliner from a near stall was discovered by the Commercial Aviation Safety Team, an airline industry group, as it reviewed thousands of reports filed by pilots to NASA’s Aviation Safety Reporting System to see if automation caused pilots to mishandle problems or to become confused or distracted. The group’s study, which ran from 2005 to 2008, highlighted 50 events in the five years prior to 2005. In 16 of them, the pilots’ failure to monitor the automation or the location of the aircraft was cited.
“I’m inclined to say it’s the very reliability of something that takes us out of the loop,” said Mr. Dismukes, who has written about the effects of automation on safety. “You may know, ‘Never turn your back, always check,’ and people may have that intention. But it’s hard to maintain that in practice when you’re not physically controlling the aircraft.”
Finding the balance between too much technology and too little is crucial, according to William B. Rouse, an engineering and computing professor at the Georgia Institute of Technology. “Complacency is an issue, but designing the interaction between human and technical so the human has the right level of judgment when you need them is a design task in itself,” Mr. Rouse said. “When the person has no role in the task, there’s a much greater risk of complacency.”
Some airline pilots confirm this. “We’ve all been there, not intentionally, but because you get distracted from the task at hand,” said a captain at Continental Airlines who did not want to be identified because he was not authorized to speak to the media. Complacency is very subtle, he said. “No light comes on to tell you that you’re being complacent.”
At its meeting this week, the transportation safety board will hold discussions of the actions of pilots and air traffic controllers in two accidents in 2009 — the crash of Colgan Air Flight 3407 in Buffalo and the midair collision between a general aviation airplane and a sightseeing helicopter over the Hudson River.
Those accidents, along with the Northwest incident, all occurred within seven months of one another and not long after the Hudson River landing of US Airways Flight 1549 that turned the pilots of that flight into icons of professional airmanship.
Chesley B. Sullenberger III, the captain of the US Airways plane, said in an interview that it would be wrong to use his success in safely bringing down that plane to criticize pilots who make mistakes like those on the Northwest plane.
“Something in the system allowed these well-trained, experienced, well-meaning, well-intentioned pilots not to notice where they were, and we need to find out what the root causes are,” he said. “Simply to blame individual practitioners is wrong and it doesn’t solve the underlying issues or prevent it from happening.”