How Airlines Decide What Counts as a Near Miss

One morning last April, a Delta Air Lines passenger jet stormed down a runway at Atlanta’s Hartsfield-Jackson International Airport, hitting 138 mph. It was about to leave for Miami when an air-traffic controller realized he had given the plane clearance to cross the path of another jet that had just landed. He hurriedly told the pilot to abort the takeoff, which jolted the passengers and risked damaging the aircraft. Fortunately, there was plenty of runway left for the plane to stop.

What happened that day became one of the thousands of incidents captured each year in commercial aviation’s multilayered incident- and accident-reporting system. The apologetic air-traffic controller filed a report with the U.S. Federal Aviation Administration, or FAA, which decided the event wasn’t a potentially disastrous near-miss and graded it a C, for no danger. Reports like these, which can be matched with technical information on speed and altitude automatically transmitted from aircraft, are submitted under agreements between the FAA, airline operators, service companies, and unions. Many are made anonymously.

Like spy-agency analysts scrutinizing a constant river of messages, airline and FAA safety experts work to head off disasters by searching for trends in these incident reports. By any measure, the system has saved lives and money; the most recent fatal crash of a U.S. commercial airliner occurred in 2009, when a regional jet went down outside Buffalo, N.Y. But recently, ground-level safety incidents at U.S. airports have been on the rise: As the Wall Street Journal reported this week, hazardous “runway incursions” jumped 25 percent this fiscal year, rising for the third year in a row.

There are questions about whether the FAA and airlines are learning all they can. Predictive safety depends on faithful reporting of these incidents, which the FAA defines, in essence, as unexpected mishaps: incidents that could affect safe operations and that involve no serious injury or substantial aircraft damage. Yet there’s no clear line between what does and doesn’t meet this definition. Publicly, only abbreviated summaries are posted for most incidents, and others get longer accounts scrubbed of some details.

Much of what happens in the skies and on the runways, therefore, stays in the skies and on the runways. The safety of flying depends in part on how much data the aviation industry decides to collect—and on what mishaps it determines are truly dangerous.

* * *

Safety experts have known the value of near-miss tracking and root-cause analysis in preventing tragedy since at least 1931, when the engineer Herbert William Heinrich theorized in Industrial Accident Prevention: A Scientific Approach that for every serious accident or fatality there were 29 minor accidents and 300 near-misses. In Heinrich’s model, the near-misses form the base of a pyramid, the accidents the middle tier, and the fatal accidents the apex.

Few now take the ratio literally, but the study of precursor events, aided by the processing power of computers and data mining, has helped to revolutionize safety management. In commercial aviation, separate reporting programs exist for airlines, air-traffic controllers, pilots, and technicians. The thousands of confessions, complaints, and other electronic reports that roll in each quarter detail faux pas big and small, such as a clipped wing on a taxiway or an unusually turbulent stretch that shakes up the passengers and crew.

The FAA sees anonymity in this reporting as key. “We certainly would not get the transparency and type of data without the anonymity,” says Peggy Gilligan, the agency’s associate administrator for aviation safety.

The closest day-to-day analogy to this kind of anonymous, non-punitive data collection would be if, every time you blew through a stop sign or cut off another driver on the freeway, you filed a report with your insurer without having to worry about your rate going up or your policy not being renewed.

Recently, I spent a few days wading through what is publicly available via the Aviation Safety Reporting System, a database NASA administers on the FAA’s behalf, in which anonymous reports are organized by types of calamities. I read about how one crew of a Boeing 757 forgot to lower the wing flaps for a daytime landing because their attention was diverted by the crew of a plane ahead, who said there was a coyote at the edge of the runway. In another report, the weary pilots of a regional jet on a multi-leg journey landed at an airport without permission. “These kinds of schedules are ridiculous. … [I]n hindsight I’m grateful nothing else happened,” a crew member wrote.

Another database, the FAA’s Accident/Incident Data System, has a different look and feel and lacks these unedited crew narratives. You can still spot the bare-bones account of the Atlanta runway incursion involving the two Delta flights, but the database’s advantage is perspective: It shows that there were a dozen runway incursions at Hartsfield-Jackson in 2015, most of them neither close nor very dangerous.

* * *

Recently, a team of scholars who wanted to know whether the FAA and airlines were learning from all of their recorded incidents turned to the FAA Accident/Incident Data System and yet another database, the National Transportation Safety Board’s aviation accident database. The NTSB investigates incidents and accidents to make safety recommendations to the FAA, which has become increasingly cooperative in sharing high-level safety information with the board.

In an article in the journal Risk Analysis, the researchers lauded the FAA’s and the industry’s accomplishments of recent years. But when they compared accident and near-miss data from 64 airlines over a 17-year stretch, from 1990 to 2007, they found that airlines learn mostly from incidents that conjure the memory of a prior accident. That could lead pilots, controllers, and mechanics to slip into a frame of mind in which close calls and last-minute adjustments become routine, a natural human tendency toward “the normalization of deviance,” the researchers wrote.

“It’s the ones that don’t scare you that we want the most attention on,” says Robin L. Dillon-Merrill, a professor at Georgetown University and one of the paper’s three co-authors. As the researchers write in the study, “prior near-misses, where risks were taken without negative consequence, deter any search for new routines” and “often reinforce dangerous behavior.”

The reaction to the journal article from Mark Millam, a vice president at the Flight Safety Foundation, a nonprofit that advocates for aviation safety, was typical of the experts I spoke to about the study. He conceded that the paper was “an interesting statistical analysis,” but he had trouble accepting its conclusions, because airlines and the FAA hold so much data that isn’t made public. (The FAA did not respond in detail to the study’s findings, pointing instead to its record on safety and incident reporting.)

Dillon-Merrill and her co-authors relied on the FAA Accident/Incident Data System, so the study was based on FAA definitions of near-misses alone. That includes, for example, bird strikes near airports. The FAA considers bird strikes “valuable safety information” that could affect aircraft design or bird-nest control near airports, but they usually don’t trigger engine failure or other damage that could cause a crash or force a dangerous emergency landing.

* * *

Dillon-Merrill readily acknowledges that classifying events as near-misses is a delicate matter. She believes the best way to define and use near-misses is as an infrequent alarm or warning signal. Set the criteria too low, she warns, and incidents become so common that they are ignored as pesky nuisances.

There are other perspectives. One is that the relentless collection of data over the past two decades has reached a point of diminishing returns. After a while, says Shawn Pruchnicki, a former pilot and a faculty member at the Ohio State University Center for Aviation Studies, “everyone assumes more data is better, but more isn’t better.”

Pruchnicki, like others who take what’s called a human-factors approach, believes in nurturing a culture that copes with and manages suddenly hazardous situations. Obsession with data, he says, is part of an obsession with rules, and long prescriptive rules are confining. An aborted takeoff, such as the one in April in Atlanta, may not be the culmination of mistakes but a sign of a resilient and flexible system. “It’s all about understanding how the system responds to unfavorable events, how we respond, not the nitty-gritty details.”

However carefully near-misses are categorized, Dillon-Merrill and her co-authors suggest that commercial aviation make incident reporting even easier than it already is and collect more reports on smaller, less obvious incidents.

So how far should this go? If a pilot swerves or changes altitude suddenly to avoid a midair collision, or hits the brakes to abort a takeoff, and in neither instance breaches the required separation between aircraft, and no one is hurt and nothing is damaged except the passengers’ peace of mind, does that automatically qualify as an incident?

“If something is shaking the passengers up, I believe it should be further investigated,” says Dillon-Merrill—even if it’s ultimately not classified as an incident.

“Unless the passengers are shaken up by everything,” she adds.


