The Fed Side

Near Miss

Lift Ticket by Hank Caruso

How do you write about a near miss? Let’s see; there I was. No, that’s been done before. How about just the facts? Well, there we were. Lowell had just called V1, so we were committed to fly, when I saw the glider. It appeared from the brush to the left and rolled onto the runway. It was near enough that I was going to lose sight of it under the nose as it rolled across the runway. I said “glider,” but nobody remembers that but me. Spoiler alert: we survived.

Later, Lowell said he had looked up in time to see the problem. Next, he located the airspeed indicator and decided we were going to fly. He was right. We weren’t loaded, and I was wrenching on the Varicam switch, that’s the trim, because you don’t pull a P2 off the runway without a little help. We also had Tim sitting wedged in the middle in the engineer position. He was blissfully ignorant of the problem.

So how does this happen? According to current thinking, it’s the cheese, Swiss Cheese, the holes lining up. I can’t dispute that. It was our first flight after winter maintenance and ground training. It always seems like there’s a lot going on when I haven’t been in the airplane for a while. We had been methodical, taking our time doing the run-up, working through the checklists. Lowell doesn’t cut me any slack if I give him the wrong response. The wind was not outrageous, ten to fifteen on the surface but gusty and from the right, having tumbled down Kingsbury Grade from Lake Tahoe. The run-up went well, the jets idling with a high-pitched moan. We hadn’t heard any traffic on Unicom. Lowell announced our intention to occupy runway 16 for the purpose of taking off. We scanned the airspace to the east and the runway as we pulled onto the taxiway in a left turn. I had a view of the approach, reported it clear, and then called for the line-up checklist.

I wrestled the controls through to the stops. “Controls free, full travel.” Lowell was writing the time down, then turned the transponder to ALT, positioned the oil coolers and cowls, and barked, “Hatches.”

“Closed left.”

“Closed right. Mixtures.”

I delayed until we were lined up on the runway. “Stage the jets.”

Lowell was already tweaking the jet throttles. He called “staged” as we aligned with the centerline.

The heading was good on my side. “Checked left,” I reported.

“Checked right. Line up check complete.”

I set thirty inches MAP on the big round engines, pulled the yoke back in order to see the hydraulic pressures and fuel flow, then moved on to the round engine instruments. Looking good. I called for max power on the jets; although we were empty, we wanted to make sure it was available. Lowell pushed the levers up while I began walking the round engine throttles to fifty inches MAP. I hesitated until things stabilized; the left torque lagged. We had noted that on previous ground runs and concluded it did not indicate a problem. I released the brakes and we began to roll.

“Fuel flow, torque’s coming up,” reported Lowell. “Airspeeds alive.”

“Mine’s alive,” I added.

The plane accelerated rapidly. Lowell held the nose down and the aileron into the wind. At fifty knots I grabbed the yoke.

“I’ve got the yoke,” I called.

“Eighty knots,” reported Lowell. Shortly thereafter, “V1.”

Then the glider appeared.


When we secured the airplane after our first flight of the year, we had visitors. The owner of the glider operation was on the ramp with the instructor pilot who had the excellent view of Tanker 48 about to fly.

The scenario: the glider, piloted by a student, had returned to the airport after a lesson, their battery-powered radio having failed. Due to the wind conditions they had opted to land on a closed runway. The instructor explained that on final he had taken the controls because the student was having difficulty controlling the aircraft. I paraphrase. He appeared contrite.

I suggested that, in the future, without radio communications, an effort should be made to remain clear of the active runway.

Although no radio is required at Minden Airport, the glider operation required their gliders to have radios and use them. In the future they planned to operate with two radios if the only option was battery power. They also saw the value of remaining clear of the active runway.

I wrote a narrative about the incident, gave it to the airport manager, and submitted a NASA report to the FAA.

A number of people had witnessed the runway incursion, near miss, potential disaster, etc., and it came to the attention of the local FAA DPE who happened to be Minden Air’s operations inspector. Some time elapsed before I got a call from the man. We had a cordial discussion and I did my best to give him my recollection of the incident.

Time passed. I got another call from the man. It seemed he had spent weeks talking to witnesses, reviewing regulations, and doing what FAA inspectors do when they’re solving problems. After a somewhat rambling, drawn-out review of the facts he announced that I needed to accept the blame for the problem. I took exception to his conclusion. He explained that FAA regulations specified gliders have the right-of-way and therefore I needed to accept the blame. I told him I was done talking to him. Spoiler alert: we worked it out. He explained he was required to counsel offenders. I told him I had done my job but I would listen, without comment, until he was done counseling. Once again, I paraphrase.

I don’t think all FAA folks take this sort of approach when attempting to analyze problems and mitigate future risk scenarios. I don’t know what it all means. But that’s the facts, Jack.


  1. Dale Head says:

    Thanks for the tale Dean…glad that disaster was averted. Always good to hear about and take heed of things of this sort.

  2. Walt Darran says:

    A little Swiss Cheese is a good way to start the day:

    Reason’s “Swiss Cheese” Model of Human Error
    A Roadmap to a Just Culture: Enhancing the Safety Environment
    Prepared by: GAIN Working Group E, Flight Ops/ATC Ops Safety Information Sharing
    First Edition • September 2004
    The term ‘no-blame culture’ flourished in the 1990s and still endures today. Compared to the largely punitive cultures that it sought to replace, it was clearly a step in the right direction. It acknowledged that a large proportion of unsafe acts were ‘honest errors’ (the kinds of slips, lapses and mistakes that even the best people can make) and were not truly blameworthy, nor was there much in the way of remedial or preventative benefit to be had by punishing their perpetrators. But the ‘no-blame’ concept had two serious weaknesses. First, it ignored—or, at least, failed to confront—those individuals who willfully (and often repeatedly) engaged in dangerous behaviors that most observers would recognize as being likely to increase the risk of a bad outcome. Second, it did not properly address the crucial business of distinguishing between culpable and non-culpable unsafe acts.
    In my view, a safety culture depends critically upon first negotiating where the line should be drawn between unacceptable behavior and blameless unsafe acts. There will always be a grey area between these two extremes where the issue has to be decided on a case-by-case basis. This is where the guidelines provided by A Roadmap to a Just Culture will be of great value. A number of aviation organizations have embarked upon this process, and the general indications are that only around 10 per cent of actions contributing to bad events are judged as culpable. In principle, at least, this means that the large majority of unsafe acts can be reported without fear of sanction. Once this crucial trust has been established, the organization begins to have a reporting culture, something that provides the system with an accessible memory, which, in turn, is the essential underpinning to a learning culture. There will, of course, be setbacks along the way. But engineering a just culture is the all-important early step; so much else depends upon it.
    James Reason
    Unsafe Acts

    Figure 2. Categories of unsafe acts committed by aircrews.
    Skill-based errors. Skill-based behavior within the context of aviation is best described as “stick-and-rudder” and other basic flight skills that occur without significant conscious thought. As a result, these skill-based actions are particularly vulnerable to failures of attention and/or memory. In fact, attention failures have been linked to many skill-based errors such as the breakdown in visual scan patterns, task fixation, the inadvertent activation of controls, and the misordering of steps in a procedure, among others (Table 1). A classic example is an aircrew that becomes so fixated on trouble-shooting a burned-out warning light that they do not notice their fatal descent into the terrain. Perhaps a bit closer to home, consider the hapless soul who locks himself out of the car or misses his exit because he was either distracted, in a hurry, or daydreaming. These are both examples of attention failures that commonly occur during highly automatized behavior. Unfortunately, while at home or driving around town these attention/memory failures may be frustrating, in the air they can become catastrophic.

    TABLE 1. Selected examples of Unsafe Acts of Pilot Operators (Note: This is not a complete listing)
    Skill-based Errors
    • Breakdown in visual scan
    • Failed to prioritize attention
    • Inadvertent use of flight controls
    • Omitted step in procedure
    • Omitted checklist item
    • Poor technique
    • Overcontrolled the aircraft
    Decision Errors
    • Improper procedure
    • Misdiagnosed emergency
    • Wrong response to emergency
    • Exceeded ability
    • Inappropriate maneuver
    • Poor decision
    Perceptual Errors (due to)
    • Misjudged distance/altitude/airspeed
    • Spatial disorientation
    • Visual illusion
    Violations
    • Failed to adhere to brief
    • Failed to use the radar altimeter
    • Flew an unauthorized approach
    • Violated training rules
    • Flew an overaggressive maneuver
    • Failed to properly prepare for the flight
    • Briefed unauthorized flight
    • Not current/qualified for the mission
    • Intentionally exceeded the limits of the aircraft
    • Continued low-altitude flight in VMC
    • Unauthorized low-altitude canyon running
    In contrast to attention failures, memory failures often appear as omitted items in a checklist, place losing, or forgotten intentions. For example, most of us have experienced going to the refrigerator only to forget what we went for. Likewise, it is not difficult to imagine that when under stress during inflight emergencies, critical steps in emergency procedures can be missed. However, even when not particularly stressed, individuals have forgotten to set the flaps on approach or lower the landing gear – at a minimum, an embarrassing gaffe.
    The third, and final, type of skill-based errors identified in many accident investigations involves technique errors. Regardless of one’s training, experience, and educational background, the manner in which one carries out a specific sequence of events may vary greatly. That is, two pilots with identical training, flight grades, and experience may differ significantly in the manner in which they maneuver their aircraft. While one pilot may fly smoothly with the grace of a soaring eagle, others may fly with the darting, rough transitions of a sparrow. Nevertheless, while both may be safe and equally adept at flying, the techniques they employ could set them up for specific failure modes. In fact, such techniques are as much a factor of innate ability and aptitude as they are an overt expression of one’s own personality, making efforts at the prevention and mitigation of technique errors difficult, at best.
    Decision errors. The second error form, decision errors, represents intentional behavior that proceeds as intended, yet the plan proves inadequate or inappropriate for the situation. Often referred to as “honest mistakes,” these unsafe acts represent the actions or inactions of individuals whose “hearts are in the right place,” but they either did not have the appropriate knowledge or just simply chose poorly.
    Perhaps the most heavily investigated of all error forms, decision errors can be grouped into three general categories: procedural errors, poor choices, and problem solving errors (Table 1). Procedural decision errors (Orasanu, 1993), or rule-based mistakes, as described by Rasmussen (1982), occur during highly structured tasks of the sorts, if X, then do Y. Aviation, particularly within the military and commercial sectors, by its very nature is highly structured, and consequently, much of pilot decision making is procedural. There are very explicit procedures to be performed at virtually all phases of flight. Still, errors can, and often do, occur when a situation is either not recognized or misdiagnosed, and the wrong procedure is applied. This is particularly true when pilots are placed in highly time-critical emergencies like an engine malfunction on takeoff.
    However, even in aviation, not all situations have corresponding procedures to deal with them. Therefore, many situations require a choice to be made among multiple response options. Consider the pilot flying home after a long week away from the family who unexpectedly confronts a line of thunderstorms directly in his path. He can choose to fly around the weather, divert to another field until the weather passes, or penetrate the weather hoping to quickly transition through it. Confronted with situations such as this, choice decision errors (Orasanu, 1993), or knowledge-based mistakes as they are otherwise known (Rasmussen, 1986), may occur. This is particularly true when there is insufficient experience, time, or other outside pressures that may preclude correct decisions. Put simply, sometimes we choose well, and sometimes we don’t.
    Finally, there are occasions when a problem is not well understood, and formal procedures and response options are not available. It is during these ill-defined situations that the invention of a novel solution is required. In a sense, individuals find themselves where no one has been before, and in many ways, must literally fly by the seats of their pants. Individuals placed in this situation must resort to slow and effortful reasoning processes where time is a luxury rarely afforded. Not surprisingly, while this type of decision making is less frequent than other forms, the relative proportion of problem-solving errors committed is markedly higher.
    Perceptual errors. Not unexpectedly, when one’s perception of the world differs from reality, errors can, and often do, occur. Typically, perceptual errors occur when sensory input is degraded or “unusual,” as is the case with visual illusions and spatial disorientation or when aircrew simply misjudge the aircraft’s altitude, attitude, or airspeed (Table 1). Visual illusions, for example, occur when the brain tries to “fill in the gaps” with what it feels belongs in a visually impoverished environment, like that seen at night or when flying in adverse weather. Likewise, spatial disorientation occurs when the vestibular system cannot resolve one’s orientation in space and therefore makes a “best guess” — typically when visual (horizon) cues are absent at night or when flying in adverse weather. In either event, the unsuspecting individual often is left to make a decision that is based on faulty information and the potential for committing an error is elevated.
    It is important to note, however, that it is not the illusion or disorientation that is classified as a perceptual error. Rather, it is the pilot’s erroneous response to the illusion or disorientation. For example, many unsuspecting pilots have experienced “black-hole” approaches, only to fly a perfectly good aircraft into the terrain or water. This continues to occur, even though it is well known that flying at night over dark, featureless terrain (e.g., a lake or field devoid of trees), will produce the illusion that the aircraft is actually higher than it is. As a result, pilots are taught to rely on their primary instruments, rather than the outside world, particularly during the approach phase of flight. Even so, some pilots fail to monitor their instruments when flying at night. Tragically, these aircrew and others who have been fooled by illusions and other disorientating flight regimes may end up involved in a fatal aircraft accident.
    By definition, errors occur within the rules and regulations espoused by an organization and typically dominate most accident databases. In contrast, violations represent a willful disregard for the rules and regulations that govern safe flight and, fortunately, occur much less frequently since they often involve fatalities (Shappell et al., 1999b).
    While there are many ways to distinguish between types of violations, two distinct forms have been identified, based on their etiology, that will help the safety professional when identifying accident causal factors. The first, routine violations, tend to be habitual by nature and often tolerated by governing authority (Reason, 1990). Consider, for example, the individual who drives consistently 5-10 mph faster than allowed by law or someone who routinely flies in marginal weather when authorized for visual meteorological conditions only. While both are certainly against the governing regulations, many others do the same thing. Furthermore, individuals who drive 64 mph in a 55 mph zone, almost always drive 64 in a 55 mph zone. That is, they “routinely” violate the speed limit. The same can typically be said of the pilot who routinely flies into marginal weather.
    What makes matters worse, these violations (commonly referred to as “bending” the rules) are often tolerated and, in effect, sanctioned by supervisory authority (i.e., you’re not likely to get a traffic citation until you exceed the posted speed limit by more than 10 mph). If, however, the local authorities started handing out traffic citations for exceeding the speed limit on the highway by 9 mph or less (as is often done on military installations), then it is less likely that individuals would violate the rules. Therefore, by definition, if a routine violation is identified, one must look further up the supervisory chain to identify those individuals in authority who are not enforcing the rules.
    On the other hand, unlike routine violations, exceptional violations appear as isolated departures from authority, not necessarily indicative of an individual’s typical behavior pattern nor condoned by management (Reason, 1990). For example, an isolated instance of driving 105 mph in a 55 mph zone is considered an exceptional violation. Likewise, flying under a bridge or engaging in other prohibited maneuvers, like low-level canyon running, would constitute an exceptional violation. However, it is important to note that, while most exceptional violations are appalling, they are not considered “exceptional” because of their extreme nature. Rather, they are considered exceptional because they are neither typical of the individual nor condoned by authority. Still, what makes exceptional violations particularly difficult for any organization to deal with is that they are not indicative of an individual’s behavioral repertoire and, as such, are particularly difficult to predict. In fact, when individuals are confronted with evidence of their dreadful behavior and asked to explain it, they are often left with little explanation. Indeed, those individuals who survived such excursions from the norm clearly knew that, if caught, dire consequences would follow. Still, defying all logic, many otherwise model citizens have been down this potentially tragic road.
    Preconditions for Unsafe Acts
    Arguably, the unsafe acts of pilots can be directly linked to nearly 80% of all aviation accidents. However, simply focusing on unsafe acts is like focusing on a fever without understanding the underlying disease causing it. Thus, investigators must dig deeper into why the unsafe acts took place. As a first step, two major subdivisions of unsafe aircrew conditions were developed: substandard conditions of operators and the substandard practices they commit (Figure 3).
    Figure 3. Categories of preconditions of unsafe acts.

    Substandard Conditions of Operators
    Adverse mental states. Being prepared mentally is critical in nearly every endeavor, but perhaps even more so in aviation. As such, the category of Adverse Mental States was created to account for those mental conditions that affect performance (Table 2). Principal among these are the loss of situational awareness, task fixation, distraction, and mental fatigue due to sleep loss or other stressors. Also included in this category are personality traits and pernicious attitudes such as overconfidence, complacency, and misplaced motivation.

    TABLE 2. Selected examples of Unsafe Aircrew Conditions (Note: This is not a complete listing)
    Substandard Conditions of Operators
    Adverse Mental States
    • Channelized attention
    • Complacency
    • Distraction
    • Mental fatigue
    • Get-home-itis
    • Haste
    • Loss of situational awareness
    • Misplaced motivation
    • Task saturation
    Adverse Physiological States
    • Impaired physiological state
    • Medical illness
    • Physiological incapacitation
    • Physical fatigue
    Physical/Mental Limitations
    • Insufficient reaction time
    • Visual limitation
    • Incompatible intelligence/aptitude
    • Incompatible physical capability
    Substandard Practices of Operators
    Crew Resource Management
    • Failed to back-up
    • Failed to communicate/coordinate
    • Failed to conduct adequate brief
    • Failed to use all available resources
    • Failure of leadership
    • Misinterpretation of traffic calls
    Personal Readiness
    • Excessive physical training
    • Self-medicating
    • Violation of crew rest requirement
    • Violation of bottle-to-throttle requirement
    Predictably, if an individual is mentally tired for whatever reason, the likelihood increases that an error will occur. In a similar fashion, overconfidence and other pernicious attitudes such as arrogance and impulsivity will influence the likelihood that a violation will be committed. Clearly then, any framework of human error must account for preexisting adverse mental states in the causal chain of events.
    Adverse physiological states. The second category, adverse physiological states, refers to those medical or physiological conditions that preclude safe operations (Table 2). Particularly important to aviation are such conditions as visual illusions and spatial disorientation as described earlier, as well as physical fatigue, and the myriad of pharmacological and medical abnormalities known to affect performance.
    The effects of visual illusions and spatial disorientation are well known to most aviators. However, less well known to aviators, and often overlooked, are the effects on cockpit performance of simply being ill. Nearly all of us have gone to work ill, dosed with over-the-counter medications, and have generally performed well. Consider, however, the pilot suffering from the common head cold. Unfortunately, most aviators view a head cold as only a minor inconvenience that can be easily remedied using over-the-counter antihistamines, acetaminophen, and other non-prescription pharmaceuticals. In fact, when confronted with a stuffy nose, aviators typically are only concerned with the effects of a painful sinus block as cabin altitude changes. Then again, it is not the overt symptoms that local flight surgeons are concerned with. Rather, it is the accompanying inner ear infection and the increased likelihood of spatial disorientation when entering instrument meteorological conditions that is alarming – not to mention the side-effects of antihistamines, fatigue, and sleep loss on pilot decision-making. Therefore, it is incumbent upon any safety professional to account for these sometimes subtle medical conditions within the causal chain of events.
    Physical/Mental Limitations. The third, and final, substandard condition involves individual physical/ mental limitations (Table 2). Specifically, this category refers to those instances when mission requirements exceed the capabilities of the individual at the controls. For example, the human visual system is severely limited at night; yet, like driving a car, drivers do not necessarily slow down or take additional precautions. In aviation, while slowing down isn’t always an option, paying additional attention to basic flight instruments and increasing one’s vigilance will often increase the safety margin. Unfortunately, when precautions are not taken, the result can be catastrophic, as pilots will often fail to see other aircraft, obstacles, or power lines due to the size or contrast of the object in the visual field.
    Similarly, there are occasions when the time required to complete a task or maneuver exceeds an individual’s capacity. Individuals vary widely in their ability to process and respond to information. Nevertheless, good pilots are typically noted for their ability to respond quickly and accurately. It is well documented, however, that if individuals are required to respond quickly (i.e., less time is available to consider all the possibilities or choices thoroughly), the probability of making an error goes up markedly. Consequently, it should be no surprise that when faced with the need for rapid processing and reaction times, as is the case in most aviation emergencies, all forms of error would be exacerbated.
    In addition to the basic sensory and information processing limitations described above, there are at least two additional instances of physical/mental limitations that need to be addressed, albeit they are often overlooked by most safety professionals. These limitations involve individuals who simply are not compatible with aviation, because they are either unsuited physically or do not possess the aptitude to fly. For example, some individuals simply don’t have the physical strength to operate in the potentially high-G environment of aviation, or for anthropometric reasons, simply have difficulty reaching the controls. In other words, cockpits have traditionally not been designed with all shapes, sizes, and physical abilities in mind. Likewise, not everyone has the mental ability or aptitude for flying aircraft. Just as not all of us can be concert pianists or NFL linebackers, not everyone has the innate ability to pilot an aircraft – a vocation that requires the unique ability to make decisions quickly and respond accurately in life threatening situations. The difficult task for the safety professional is identifying whether aptitude might have contributed to the accident causal sequence.
    Substandard Practices of Operators
    Clearly then, numerous substandard conditions of operators can, and do, lead to the commission of unsafe acts. Nevertheless, there are a number of things that we do to ourselves that set up these substandard conditions. Generally speaking, the substandard practices of operators can be summed up in two categories: crew resource mismanagement and personal readiness.
    Crew Resource Mismanagement. Good communication skills and team coordination have been the mantra of industrial/organizational and personnel psychology for decades. Not surprising then, crew resource management has been a cornerstone of aviation for the last few decades (Helmreich & Foushee, 1993). As a result, the category of crew resource mismanagement was created to account for occurrences of poor coordination among personnel. Within the context of aviation, this includes coordination both within and between aircraft with air traffic control facilities and maintenance control, as well as with facility and other support personnel as necessary. But aircrew coordination does not stop with the aircrew in flight. It also includes coordination before and after the flight with the brief and debrief of the aircrew.
    It is not difficult to envision a scenario where the lack of crew coordination has led to confusion and poor decision making in the cockpit, resulting in an accident. In fact, aviation accident databases are replete with instances of poor coordination among aircrew. One of the more tragic examples was the crash of a civilian airliner at night in the Florida Everglades in 1972 as the crew was busily trying to troubleshoot what amounted to a burnt-out indicator light. Unfortunately, no one in the cockpit was monitoring the aircraft’s altitude as the altitude hold was inadvertently disconnected. Ideally, the crew would have coordinated the trouble-shooting task ensuring that at least one crewmember was monitoring basic flight instruments and “flying” the aircraft. Tragically, this was not the case, as they entered a slow, unrecognized descent into the Everglades, resulting in numerous fatalities.
    Personal Readiness. In aviation, or for that matter in any occupational setting, individuals are expected to show up for work ready to perform at optimal levels. Nevertheless, in aviation as in other professions, personal readiness failures occur when individuals fail to prepare physically or mentally for duty. For instance, violations of crew rest requirements, bottle-to-brief rules, and self-medicating all will affect performance on the job and are particularly detrimental in the aircraft. It is not hard to imagine that, when individuals violate crew rest requirements, they run the risk of mental fatigue and other adverse mental states, which ultimately lead to errors and accidents. Note however, that violations that affect personal readiness are not considered “unsafe act, violation” since they typically do not happen in the cockpit, nor are they necessarily active failures with direct and immediate consequences.
    Still, not all personal readiness failures occur as a result of violations of governing rules or regulations. For example, running 10 miles before piloting an aircraft may not be against any existing regulations, yet it may impair the physical and mental capabilities of the individual enough to degrade performance and elicit unsafe acts. Likewise, the traditional “candy bar and coke” lunch of the modern businessman may sound good but may not be sufficient to sustain performance in the rigorous environment of aviation. While there may be no rules governing such behavior, pilots must use good judgment when deciding whether they are “fit” to fly an aircraft.
    Unsafe Supervision
    Figure 4. Categories of unsafe supervision.

    Inadequate Supervision.
    The role of any supervisor is to provide the opportunity to succeed. To do this, the supervisor, no matter at what level of operation, must provide guidance, training opportunities, leadership, and motivation, as well as the proper role model to be emulated. Unfortunately, this is not always the case.
    For example, it is not difficult to conceive of a situation where adequate crew resource management training was either not provided, or the opportunity to attend such training was not afforded to a particular aircrew member. Conceivably, aircrew coordination skills would be compromised and if the aircraft were put into an adverse situation (an emergency for instance), the risk of an error being committed would be exacerbated and the potential for an accident would increase markedly.
    In a similar vein, sound professional guidance and oversight is an essential ingredient of any successful organization. While empowering individuals to make decisions and function independently is certainly essential, this does not divorce the supervisor from accountability. The lack of guidance and oversight has proven to be the breeding ground for many of the violations that have crept into the cockpit. As such, any thorough investigation of accident causal factors must consider the role supervision plays (i.e., whether the supervision was inappropriate or did not occur at all) in the genesis of human error (Table 3).

    TABLE 3. Selected examples of Unsafe Supervision (Note: This is not a complete listing)
    Inadequate Supervision
    • Failed to provide guidance
    • Failed to provide operational doctrine
    • Failed to provide oversight
    • Failed to provide training
    • Failed to track qualifications
    • Failed to track performance
    Planned Inappropriate Operations
    • Failed to provide correct data
    • Failed to provide adequate brief time
    • Improper manning
    • Mission not in accordance with rules/regulations
    • Provided inadequate opportunity for crew rest
    Failed to Correct a Known Problem
    • Failed to correct document in error
    • Failed to identify an at-risk aviator
    • Failed to initiate corrective action
    • Failed to report unsafe tendencies
    Supervisory Violations
    • Authorized unnecessary hazard
    • Failed to enforce rules and regulations
    • Authorized unqualified crew for flight
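The taxonomy in Table 3 lends itself to a simple coding scheme for tagging causal factors during an investigation. The sketch below is illustrative only; the category and example strings come from Table 3, but the `classify` helper and all identifier names are mine, not part of HFACS.

```python
# A minimal sketch of Table 3's unsafe-supervision taxonomy encoded as
# a lookup structure. Names of classes/functions are hypothetical.
from enum import Enum
from typing import Optional

class UnsafeSupervision(Enum):
    INADEQUATE_SUPERVISION = "Inadequate Supervision"
    PLANNED_INAPPROPRIATE_OPS = "Planned Inappropriate Operations"
    FAILED_TO_CORRECT_KNOWN_PROBLEM = "Failed to Correct a Known Problem"
    SUPERVISORY_VIOLATIONS = "Supervisory Violations"

# Example failures drawn directly from Table 3 (not a complete listing).
EXAMPLES = {
    UnsafeSupervision.INADEQUATE_SUPERVISION: [
        "Failed to provide guidance",
        "Failed to provide training",
        "Failed to track qualifications",
    ],
    UnsafeSupervision.PLANNED_INAPPROPRIATE_OPS: [
        "Improper manning",
        "Provided inadequate opportunity for crew rest",
    ],
    UnsafeSupervision.FAILED_TO_CORRECT_KNOWN_PROBLEM: [
        "Failed to identify an at-risk aviator",
        "Failed to initiate corrective action",
    ],
    UnsafeSupervision.SUPERVISORY_VIOLATIONS: [
        "Authorized unqualified crew for flight",
        "Failed to enforce rules and regulations",
    ],
}

def classify(finding: str) -> Optional[UnsafeSupervision]:
    """Return the supervision category containing the finding, if any."""
    for category, examples in EXAMPLES.items():
        if finding in examples:
            return category
    return None
```

Structuring the categories this way makes it straightforward to tally which supervisory failure modes recur across a body of accident reports.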
    Planned Inappropriate Operations.
    Occasionally, the operational tempo and/or the scheduling of aircrew is such that individuals are put at unacceptable risk, crew rest is jeopardized, and ultimately performance is adversely affected. Such operations, though arguably unavoidable during emergencies, are unacceptable during normal operations. Therefore, the second category of unsafe supervision, planned inappropriate operations, was created to account for these failures (Table 3).
    Take, for example, the issue of improper crew pairing. It is well known that when very senior, dictatorial captains are paired with very junior, weak co-pilots, communication and coordination problems are likely to occur. Commonly referred to as the trans-cockpit authority gradient, such conditions likely contributed to the tragic crash of a commercial airliner into the Potomac River outside of Washington, DC, in January of 1982 (NTSB, 1982). In that accident, the captain of the aircraft repeatedly rebuffed the first officer when the latter indicated that the engine instruments did not appear normal. Undaunted, the captain continued a fatal takeoff in icing conditions with less than adequate takeoff thrust. The aircraft stalled and plummeted into the icy river, killing the crew and many of the passengers.
    Clearly, the captain and crew were held accountable. They died in the accident and cannot shed light on causation. But what was the role of the supervisory chain? Perhaps crew pairing was equally responsible. Although not specifically addressed in the report, such issues are clearly worth exploring in many accidents. In fact, in that particular accident, several other training and manning issues were identified.
    Failure to Correct a Known Problem.
    The third category of unsafe supervision, Failed to Correct a Known Problem, refers to those instances when deficiencies among individuals, equipment, training or other related safety areas are “known” to the supervisor, yet are allowed to continue unabated (Table 3). For example, it is not uncommon for accident investigators to interview the pilot’s friends, colleagues, and supervisors after a fatal crash only to find out that they “knew it would happen to him some day.” If the supervisor knew that a pilot was incapable of flying safely, and allowed the flight anyway, he clearly did the pilot no favors. The failure to correct the behavior, either through remedial training or, if necessary, removal from flight status, essentially signed the pilot’s death warrant – not to mention that of others who may have been on board.
    Likewise, the failure to consistently correct or discipline inappropriate behavior certainly fosters an unsafe atmosphere and promotes the violation of rules. Aviation history is rich with reports of aviators who tell hair-raising stories of their exploits and barnstorming low-level flights (the infamous “been there, done that”). While entertaining to some, such tales often serve to promulgate a perception of tolerance and “one-upmanship” until one day someone ties the low-altitude flight record of ground level! Indeed, the failure to report these unsafe tendencies and initiate corrective actions is yet another example of the failure to correct known problems.
    Supervisory Violations.
    Supervisory violations, on the other hand, are reserved for those instances when existing rules and regulations are willfully disregarded by supervisors (Table 3). Although arguably rare, supervisors have been known occasionally to violate the rules and doctrine when managing their assets. For instance, there have been occasions when individuals were permitted to operate an aircraft without current qualifications or license. Likewise, it can be argued that failing to enforce existing rules and regulations or flouting authority are also violations at the supervisory level. While rare and possibly difficult to cull out, such practices are a flagrant violation of the rules and invariably set the stage for the tragic sequence of events that predictably follow.
    One particularly appealing approach to the genesis of human error is the one proposed by James Reason (1990). Generally referred to as the “Swiss cheese” model of human error, Reason describes four levels of human failure, each influencing the next (Figure 1). Working backwards in time from the accident, the first level depicts those Unsafe Acts of Operators that ultimately led to the accident[1]. More commonly referred to in aviation as aircrew/pilot error, this level is where most accident investigations have focused their efforts and consequently, where most causal factors are uncovered. After all, it is typically the actions or inactions of aircrew that are directly linked to the accident. For instance, failing to properly scan the aircraft’s instruments while in instrument meteorological conditions (IMC) or penetrating IMC when authorized only for visual meteorological conditions (VMC) may yield relatively immediate, and potentially grave, consequences. Represented as “holes” in the cheese, these active failures are typically the last unsafe acts committed by aircrew.
    [1] Reason’s original work involved operators of a nuclear power plant. However, for the purposes of this manuscript, the operators here refer to aircrew, maintainers, supervisors and other humans involved in aviation.
    However, what makes the “Swiss cheese” model particularly useful in accident investigation, is that it forces investigators to address latent failures within the causal sequence of events as well. As their name suggests, latent failures, unlike their active counterparts, may lie dormant or undetected for hours, days, weeks, or even longer, until one day they adversely affect the unsuspecting aircrew. Consequently, they may be overlooked by investigators with even the best intentions.
    Within this concept of latent failures, Reason described three more levels of human failure. The first involves the condition of the aircrew as it affects performance. Referred to as Preconditions for Unsafe Acts, this level involves conditions such as mental fatigue and poor communication and coordination practices, often referred to as crew resource management (CRM). Not surprisingly, if fatigued aircrew fail to communicate and coordinate their activities with others in the cockpit or individuals external to the aircraft (e.g., air traffic control, maintenance, etc.), poor decisions are made and errors often result.
    Figure 1. The “Swiss cheese” model of human error causation (adapted from Reason, 1990).
    But exactly why did communication and coordination break down in the first place? This is perhaps where Reason’s work departed from more traditional approaches to human error. In many instances, the breakdown in good CRM practices can be traced back to instances of Unsafe Supervision, the third level of human failure. If, for example, two inexperienced (and perhaps even below-average) pilots are paired with each other and sent on a flight into known adverse weather at night, is anyone really surprised by a tragic outcome? To make matters worse, if this questionable manning practice is coupled with the lack of quality CRM training, the potential for miscommunication and ultimately, aircrew errors, is magnified. In a sense then, the crew was “set up” for failure as crew coordination and ultimately performance would be compromised. This is not to lessen the role played by the aircrew, only that intervention and mitigation strategies might lie higher within the system.
    Reason’s model didn’t stop at the supervisory level either; the organization itself can impact performance at all levels. For instance, in times of fiscal austerity, funding is often cut, and as a result, training and flight time are curtailed. Consequently, supervisors are often left with no alternative but to task “non-proficient” aviators with complex tasks. Not surprisingly then, in the absence of good CRM training, communication and coordination failures will begin to appear as will a myriad of other preconditions, all of which will affect performance and elicit aircrew errors. Therefore, it makes sense that, if the accident rate is going to be reduced beyond current levels, investigators and analysts alike must examine the accident sequence in its entirety and expand it beyond the cockpit. Ultimately, causal factors at all levels within the organization must be addressed if any accident investigation and prevention system is going to succeed.
    In many ways, Reason’s “Swiss cheese” model of accident causation has revolutionized common views of accident causation. Unfortunately, however, it is simply a theory with few details on how to apply it in a real-world setting. In other words, the theory never defines what the “holes in the cheese” really are, at least within the context of everyday operations. Ultimately, one needs to know what these system failures or “holes” are, so that they can be identified during accident investigations or better yet, detected and corrected before an accident occurs.
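The core intuition of the model, that an accident requires failures to align across every layer simultaneously, can be made concrete with a toy simulation. The sketch below is my own construction, not part of Reason's model or this paper; layer names match Figure 1, but the hole probabilities and simulation logic are purely illustrative.

```python
# Toy illustration of the "holes lining up" idea: an unsafe trajectory
# reaches the accident only if a hole is present at all four layers.
import random

LAYERS = [
    "Organizational Influences",
    "Unsafe Supervision",
    "Preconditions for Unsafe Acts",
    "Unsafe Acts of Operators",
]

def accident_occurs(hole_probability: float, rng: random.Random) -> bool:
    """True only if every layer independently presents a 'hole'."""
    return all(rng.random() < hole_probability for _ in LAYERS)

def accident_rate(hole_probability: float,
                  trials: int = 100_000,
                  seed: int = 0) -> float:
    """Estimate the fraction of trials in which all four holes align."""
    rng = random.Random(seed)
    return sum(accident_occurs(hole_probability, rng)
               for _ in range(trials)) / trials
```

Under these (assumed) independence conditions, plugging even a single layer, i.e., driving one hole probability toward zero, collapses the overall rate, which is the model's argument for looking beyond the cockpit for intervention points.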

    The Human Factors
    Analysis and Classification
    Scott A. Shappell
    FAA Civil Aeromedical Institute
    Oklahoma City, OK 73125
    Douglas A. Wiegmann
    University of Illinois at Urbana-Champaign
    Institute of Aviation
    Savoy, IL 61874
