Proceedings of the 9th International Symposium on Aviation Psychology — 1997

AUTOMATION AND SITUATION AWARENESS: THE ACCIDENT AT CALI, COLOMBIA

Dr. Mica Endsley, Massachusetts Institute of Technology, Cambridge, MA
Dr. Barry Strauch(1), National Transportation Safety Board, Washington, D.C.

(1) The views of the second author are his and not necessarily those of the National Transportation Safety Board.

INTRODUCTION

The flight management systems (FMSs) of automated aircraft can provide pilots with a considerable amount of flight path information in a readily interpretable manner. This includes the predicted path; the relative positions of adjacent navaids, airports, and adverse weather activity; and the location in space at which a predetermined constant-rate climb or descent will conclude. Indeed, flight management systems can navigate so accurately that pilots are permitted to fly almost an entire flight, with the exception of the approach and landing, without reference to navigation charts. As long as the appropriate preprogrammed course is selected from the database, and the correct data are entered or selected when required, pilots can be confident that the navigation tasks performed will be accurate. Numerous procedural and/or training issues can arise in the interface of these systems with pilots, however. Despite their high reliability, accurate flight path control, and flexible display of critical aircraft-related information, automated flight management systems can actually decrease pilots' awareness of parameters critical to flight path control through out-of-the-loop performance decrements, over-reliance on automation, and poor human monitoring capabilities.

The circumstances of the accident involving an American Airlines (AA) Boeing 757 that struck a mountain while descending for a landing at Cali, Colombia, on December 20, 1995, and the manner in which the pilots lost situation awareness (SA), reveal much about the nature of situation awareness and the factors that can affect it in a dynamic environment such as an aircraft cockpit. In this paper, we closely examine the circumstances that contributed to the pilots' loss of situation awareness and describe how it ultimately led to the accident.

THE ACCIDENT

As with most accidents, the crash of AA 965 was determined not to have been due to a single cause (Aeronautica Civil of the Republic of Colombia, 1996), but rather to a series of factors, none of which alone caused the accident, but which, taken together, led to the accident and the resultant loss of life. After a delay of approximately one and one-half hours in Miami, Florida, due first to late connections and then to air traffic control delays, the flight to Cali was uneventful until entering Colombian airspace. The first officer was the pilot flying. The captain was communicating with air traffic control and, in accordance with the airline's policy, managing the FMS. Conversation recorded on the airplane's cockpit voice recorder (CVR) indicates that the captain was concerned that the initial departure delay could delay the next day's return flight (to allow for a contractually required flight attendant rest period). Upon entering Cali airspace, which was not equipped with air traffic control radar, the following sequence of events occurred:

(1) At 2134:57, the pilots interpreted a clearance to proceed to the Cali VOR as a clearance to proceed directly to that fix. This led to their execution of a command to the FMS to proceed direct to the Cali VOR. This command caused the FMS, as designed, to drop an intermediate fix, the Tulua VOR, from the map display and from the navigation computations and commands.
(2) At 2136:31, the Cali air traffic controller asked the crew whether they would accept a clearance to a different runway than the one they had expected and initially prepared for, an offer that they accepted. They were then cleared to execute the approach for that runway, an approach that used the Tulua VOR as the initial approach fix, and were told to execute the Rozo One arrival. Because of their position and altitude, the crew had little time available to descend as needed and to perform the steps necessary to prepare for the new approach. The CVR contains references to the rapidity with which the pilots attempted to perform the tasks. For example, the captain asked the first officer, as the pilot flying, whether he had sufficient time to retrieve the navigation chart that portrayed the arrival route into Cali.

(3) At 2137:29, less than one minute later, the captain asked the controller whether they could proceed to a beacon less than three miles from the runway, named Rozo and labeled on the approach chart by the abbreviation "R". The controller replied "affirmative" and repeated the clearance to fly the Rozo One arrival and land on runway 19.

(4) Investigators learned that when the crew entered "R" into the FMS, the entry brought up not Rozo but another beacon, named Romeo, which was designated as R in the FMS database and located near Bogota, some 130 miles behind the airplane. The evidence indicates that because neither crewmember noticed the location of "R," one executed an FMS command to proceed directly to "R," a command that placed the aircraft into a left bank on a flight path away from Cali and the desired approach. For over one minute, neither pilot noticed the turn away from Cali.

(5) At 2138:49, the crew recognized that the aircraft was off path and headed in the wrong direction. However, both the captain and first officer had considerable difficulty determining their location relative to Tulua, demonstrating a considerable loss of positional SA. They then spent several minutes trying to ascertain their location and flight path while attempting to determine how to correct the situation.

(6) The crew turned the aircraft towards the desired path to the runway; however, they demonstrated a further loss of SA regarding their distance from the surrounding terrain. At 2141:15, the sound of the ground proximity warning system (GPWS), indicating an imminent collision, could be heard.

(7) The crew attempted an immediate climb away from the terrain but were unsuccessful, largely because they were unaware that the speedbrakes, which reduce lift and increase drag, were still deployed. Impact occurred 13 seconds later, at 2141:28.

THE LOSS OF SITUATION AWARENESS

Examination of the crew's actions raises several questions about the actions and decisions of the two pilots, particularly considering that they possessed unblemished records and were experienced both in the Boeing 757 and in South American operations. How did they lose awareness of the terrain, their position, and their continued descent? We believe that these questions are related, and that the loss of situation awareness manifest in this accident was due to a combination of factors, many of which are inherent in the use of current FMS systems in automated cockpits. While the series of errors and circumstances that led to this accident was unique, the root causes associated with each are indicative of a number of problems that are reducing the margin of safety in glass cockpit aircraft.

Local versus Global SA

The FMS provides significant navigation advantages. The pilot can program in any desired path by entering a series of waypoints into the computer. On Boeing aircraft, the programmed flight path is shown on an FMS-generated display as a prominent magenta line, which the automated flight guidance system will duly follow with considerable accuracy. This type of system can provide a high level of local situation awareness — the pilots can readily comprehend exactly where they are in relation to a very specific goal: following a desired path. Maintaining global situation awareness can be a problem, however.
Global situation awareness refers to the pilot's knowledge of information relevant to a wide range of goals (Endsley, 1995). Such knowledge is necessary for determining which goals should be active (e.g., knowledge of the nearness of terrain should alert the pilot to work on terrain avoidance) and for rapidly switching to a new goal (e.g., re-routing to a new airport or runway). While certain types of displays may support very narrowly defined tasks well, they may not provide the global situation awareness needed to support a wider range of tasks and goals.

In the Cali accident, the pilots faced the challenge of working with an FMS display which, by design, portrayed information about the location of navigational fixes but not environmental features such as terrain. The pilots entered "Direct CLO" (direct to the Cali VOR) in response to a miscommunication with air traffic control (ATC) that led them to believe they had a clearance to proceed direct to Cali, as opposed to following the usual waypoints on designated airways. Requesting and receiving a direct clearance is not uncommon in radar-controlled airspace, which this aircrew, with its extensive background flying in the U.S., was accustomed to. The act of making a direct entry into the FMS had an unfortunate side effect, however: it caused a new flight path to be presented between the aircraft's current position and the Cali VOR (labeled CLO), and all intervening waypoints along the original path to disappear.
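The side effect can be pictured with a minimal sketch in Python; the route-as-list model here is our own simplification for illustration and does not represent the actual logic of the 757's FMS:

    # Toy model of a "direct to" entry pruning the active route.
    # Purely illustrative; not the logic of any actual FMS.

    route = ["TULUA (ULQ)", "CALI (CLO)"]   # simplified programmed waypoints

    def direct_to(route, fix):
        """Proceed straight to `fix`, dropping every waypoint before it
        from both the navigation computation and the map display."""
        return route[route.index(fix):]

    active = direct_to(route, "CALI (CLO)")
    print(active)   # ['CALI (CLO)'] -- Tulua has silently disappeared

The point of the sketch is that the pruning is silent: nothing in the interaction tells the crew that fixes they may need moments later are no longer in the active route or on the display.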

Thus, when the aircrew received a later clearance from ATC to "report Tulua," they could not find this waypoint (labeled ULQ) on their display or in an FMS control device. They devoted considerable effort, in a time-pressured situation, to trying to find ULQ or other points on their display that corresponded to those on the new approach to runway 19. The selected display supported neither the global SA needed to determine their location relative to pertinent landmarks, nor the global SA needed to rapidly change goals (programming in a new flight path).

Advances in microprocessing technology may allow future pilots to visualize the proximity of terrain to their flight path. Recent developments have led to what has been termed the enhanced ground proximity warning system (EGPWS), which contains digitized terrain information. This information, when matched against the projected flight path, can give pilots considerably more advance warning of a flight path that approaches terrain than is presently possible with GPWS. Since the Cali accident, American Airlines has ordered EGPWS installed on its fleet. While many human factors issues remain to be resolved with EGPWS, such systems may greatly increase the global SA available to pilots.

This example is also indicative of an underlying problem with the FMS display. Pilots essentially generate their own selective display of the external world, based on the commands they enter. With FMS-equipped aircraft, it is possible to enter any series of waypoints and the aircraft will fly that path; however, except when flying near adverse weather, it can be very difficult to detect whether the created path is potentially unsafe or incorrect. For instance, there have been reports of crews creating flight paths to the wrong location without detecting the error, and of inadvertently creating a flight path that would take the plane directly through a mountain (Wiener, 1988). Without verifying the accuracy of the flight path by comparison with navigation charts, pilots cannot detect these errors simply by examining the displays, and such programming errors are actually fairly easy to make.

Hard-copy maps might be relied upon to fill this gap; however, several problems make this method less than optimal. First, the Cali accident demonstrated that there can be substantial differences in the points and nomenclature used between the two information sources. As a result, establishing a correspondence between identical points on the two different navigation sources can be both difficult and time consuming. In this accident, the points on the desired flight path were named CF19 and FF19 in the FMS-generated data, and D21 and D16 on the map. It takes considerable calculation to determine that they actually represent the same points, and these calculations are time-consuming. Second, whenever the integration of multiple sources of information is required for a single task, there is a greater chance of error and reduced efficiency in the task. Several researchers have discussed the advantage of integrated information displays for supporting task performance (see Wickens, 1992, for a discussion). The fact that the FMS, a computerized display, provides a highly perceptually salient, central source of information only compounds the problem. In such a situation, it is more likely that a non-salient cue (e.g., small text on a very busy map) will not be adequately attended to (Endsley, 1995). There is a considerable need, therefore, for salient reference information on FMS displays that would allow unsafe and erroneously created flight paths to be detected.
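As a sketch of the kind of check that digitized terrain data (as in EGPWS) makes possible, the following compares a projected descent profile against terrain elevations along track; the elevation grid, path, and clearance margin are all invented for this example:

    # Illustrative terrain-clearance check on a projected flight path.
    # Terrain elevations, path, and margin are invented for this example.

    TERRAIN_FT = {0: 3000, 10: 4000, 20: 9000, 30: 12000}  # elevation by track mile

    def clearance_alerts(path, margin_ft=1000):
        """path: (track_mile, planned_altitude_ft) pairs along the projection.
        Returns the points where planned altitude violates the terrain margin."""
        return [(mile, alt) for mile, alt in path
                if alt < TERRAIN_FT.get(mile, 0) + margin_ft]

    # A lateral path that looks clean on the map but descends into rising terrain:
    projected = [(0, 13000), (10, 10000), (20, 8000), (30, 5000)]
    print(clearance_alerts(projected))  # [(20, 8000), (30, 5000)]

A check of this kind flags a conflict while it is still tens of miles ahead, which is precisely the advance warning that a reactive GPWS, looking only at the terrain directly below, cannot give.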
Vertical versus Lateral Flight Display

A particular deficiency of the FMS, in terms of its ability to support the SA requirements of the pilot, is its presentation of vertical information. The flight path displayed shows only the programmed lateral path of the aircraft. No direct display of either the vertical path of the aircraft or its relationship to surrounding terrain is provided. A review of controlled flight into terrain (CFIT) accidents shows that in the majority of cases the aircraft was correctly aligned with the lateral flight path, but the vertical path was not executed properly. That is, the problem appears to be one of correctly perceiving where the aircraft is in relation to the desired vertical flight path.

In the Cali accident, the pilots began a descent to 5,000 feet mean sea level per an ATC clearance. Due to an FMS programming error they got off course, finally detected the error, and reversed the path of the aircraft back to the "extended centerline" (the desired path to the runway). Throughout this process, however, they neither arrested nor slowed the rate of descent. There did not appear to be a lack of SA regarding the current altitude of the aircraft: some of the last transmissions from the aircraft involved callouts of altitude, in response to an ATC request, at 10,000 feet and 9,000 feet. There was a complete lack of awareness of the significance of that altitude, however. This would be considered a lack of level 2 situation awareness — comprehension of the significance of perceived information (Endsley, 1995). The pilots demonstrated a lack of awareness of the proximity and altitude of the surrounding terrain that would have alerted them to the danger of continuing their descent. A direct display of vertical path information and its relation to surrounding terrain was not provided. This state of affairs allowed the crew to continue their descent without questioning its advisability (at least as far as the cockpit voice recorder reveals). Particularly in light of the fact that the lateral path was so clearly and saliently displayed, the lack of salience of vertical information on the FMS was a significant factor in this accident.

FMS Interface

The FMS used in the Boeing 757, like the FMSs in many other aircraft produced by Boeing, Airbus, and McDonnell Douglas, among others, provides navigation and flight control for the aircraft. These functions are directed by the pilot through programming, much as one would program a computer on the office desktop. When events proceed as laid out in the programmed flight path (usually entered on the ground prior to departure), this system works quite well, and the pilots' job becomes one of monitoring the system. When things do not go as planned and changes must be made in flight, however, significant difficulties can occur.

In this accident, the pilots received a clearance to runway 19, a runway they were not expecting, and they chose to accept that clearance (possibly in an effort to land as quickly as possible, or possibly due to a confirmation bias, as they had previously set up the FMS for runway 1 per information from the company dispatcher and in accordance with previous experience at the Cali airport). At that point they needed to find and review the necessary approach charts and perform a number of steps to program the new approach into the FMS. This process was complicated, however, by the fact that the earlier entry to go direct to the Cali VOR had removed from the display the points needed for creating the proper path; that is, the Tulua VOR (ULQ) was no longer displayed.

Last-minute runway assignments can create a significant problem for pilots when they necessitate reprogramming the FMS to execute and/or display the new approach. (Pilots routinely do not know for certain in advance of a flight which runway they will be assigned, as this information often changes.) Wiener (1985) pointed out a decade ago that this is a significant issue in glass cockpit aircraft, increasing workload when workload is already high, a concern that has been echoed by others. The requirement to reprogram the FMS and cross-check the entries at the last minute certainly played a role in this accident.

Making a programming error is a frequent occurrence when working with computers in any environment, one that people make daily. In the cockpit, however, the consequences of such an error can be very high, and the likelihood of catching it in a time-compressed situation is lower. In this case, the pilots entered "R" to direct the aircraft to fly to a fix on the approach named Rozo. While "R" was the designation for Rozo indicated on the approach chart, it was not the designation used for that point in the FMS database. The "R" they actually selected was assigned to another point in Colombia, named Romeo. This was a central error in this accident, one that sent the aircraft into a roughly 180-degree left turn towards Romeo. It was a simple error for the pilots to make, likely induced by the fact that "R" was the expected designation for Rozo and was presented on the charts as such. A poorly understood FMS naming convention led to the designation of R for Romeo, and not Rozo, in the FMS database. (Romeo was nearer to Bogota, the larger airport in Colombia, and therefore received the single-letter designator R; Rozo was assigned its full name in the database.) Jeppesen Sanderson, the developer of the navigation information for the FMS database, reported that it follows one set of standards for creating the FMS databases (ARINC 424) and a different set of standards for presenting points on the paper maps (ICAO) (Aeronautica Civil of the Republic of Colombia, 1996). This can lead to a significant problem of inconsistency between these two sources of information for the pilot.
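The ambiguity, and one conceivable guard against it, can be sketched as follows. The coordinates are rough approximations, and the distance sanity check is our own hypothetical addition, not a feature of the actual FMS or of ARINC 424:

    # Illustrative sketch of the "R" identifier ambiguity. Coordinates are
    # approximate; the distance check is hypothetical, not an FMS feature.

    import math

    FIXES = {  # identifier -> (name, lat, lon)
        "R":    ("Romeo (near Bogota)", 4.7, -74.1),
        "ROZO": ("Rozo (near Cali)",    3.6, -76.4),
    }

    def naive_lookup(ident):
        return FIXES[ident]  # returns whatever the database filed under "R"

    def checked_lookup(ident, acft_lat, acft_lon, max_nm=100):
        name, lat, lon = FIXES[ident]
        # crude flat-earth distance in nautical miles (1 degree ~ 60 nm)
        dist = 60 * math.hypot(lat - acft_lat,
                               (lon - acft_lon) * math.cos(math.radians(acft_lat)))
        if dist > max_nm:
            raise ValueError(f'"{ident}" resolves to {name}, {dist:.0f} nm away')
        return name

    print(naive_lookup("R")[0])          # Romeo (near Bogota) -- behind the aircraft
    try:
        checked_lookup("R", 3.9, -76.3)  # aircraft position roughly north of Cali
    except ValueError as err:
        print(err)                       # the distance check catches the resolution

Whether such a guard belongs in the avionics or in crew procedures is debatable; the point is only that the database lookup itself gave the crew no cue that "R" lay some 130 miles behind the aircraft.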
Supervisory Control, Monitoring and Trust

In flying a glass cockpit aircraft, the normal mode of flying is indirect, through the programming of the FMS computer. In this role, the pilot becomes a supervisory controller (Sheridan, 1986). Significant performance advantages in terms of flight accuracy can be found for this mode of operation; however, several hazards are also present that appear to have been factors in this accident: trust, complacency, and out-of-the-loop performance problems.

In this accident, several errors occurred that can be directly explained in terms of the design of the FMS, its displays, and its database. The most puzzling error, however, was that following the incorrect entry of "R" into the FMS, approximately one minute and seven seconds elapsed before there was any indication that the flight crew noticed they were in a bank away from Cali. This was despite the fact that the FMS-generated map display was designed to prominently display both the proposed and actual flight path. The most plausible explanation for this error is that the crew was busy attending to other tasks during this period (e.g., finding the correct charts, studying the approach). In an investigation of SA errors based on Aviation Safety Reporting System (ASRS) reports, not attending to available information was found to be the single greatest category of SA error causal factors (35.1%), and task distraction was the single greatest reason for this happening (Jones & Endsley, 1996).

Several larger issues involved with automated systems are also potential contributors to this lapse. In a time-critical situation, it appears that the flight crew trusted the automation to carry out its task (fly to the designated point "R"), as it had many times before. The issue of automation-induced complacency has received considerable attention, and the degree of trust an operator has in the system can be a significant factor in that complacency (Muir, 1994; Riley, 1994). Pertinent to this accident, Parasuraman, Molloy and Singh (1993) found that poor monitoring of automation is likely to be a problem when dual-task conditions are present. In later work, evidence was found that the division of attention across two separate sources of information appears to be a significant factor in this problem (Parasuraman, Molloy, Mouloua, & Hilburn, 1996). In the Cali accident, the fact that the pilots had become loaded with very demanding tasks that required the use of separate, non-integrated sources of information may have contributed to their lack of vigilance in monitoring the automation during the turn.

Once the pilots did detect the turn, they spent the next several minutes trying to determine the nature of their navigation difficulty and then to correct the problem. This incident is typical of the out-of-the-loop performance problem that has been noted to occur with automated systems. Not only did it take the pilots some time to realize there was a problem, they were also sufficiently out of the loop that they had significant difficulty ascertaining just how they had ended up in that position (understanding what the current system state actually was) and figuring out how to rectify it. The pilots had significant difficulty trying to correct the state they found themselves in, and their confusion was evident. This is consistent with research that has found lower level 2 SA (comprehension of information) under automation (Endsley & Kiris, 1995). To their credit, the crew tried several tactics, including switching to heading select to bring the aircraft back to the desired path. They were never able to fully regain their SA, however. They appeared confused as to where to go — to Tulua (which was at that point behind them) to begin the published approach path, or on to Rozo or Cali (as a prior clearance indicated).

Adding to this problem, the captain appeared to be engaged in a state we will call automation fixation. People have grown accustomed to technology working in only fixed ways, and they may routinely have to try several tactics to get it to do what they want it to do. They may therefore engage in persistence behavior (repeatedly trying different approaches), which is frequently, eventually, successful. Engaging in this type of automation fixation can have very negative consequences, however, if the circumstances are such that a wiser course of action would be to give up and do the task another way (e.g., fly the aircraft manually).
Even after all of the problems encountered by this crew, the captain remained intent on trying to program the FMS to fly the approach path. Unfortunately, before they were able to recognize the nature of the navigation difficulty, the GPWS alerted them to the imminence of the terrain. There is no indication in the cockpit discussions that, prior to the GPWS warning, either crewmember was aware of the proximity of the terrain they were flying over. This may reflect task narrowing (they were so focused on the lateral navigation problem of finding Rozo or Tulua, and on the FMS, that they neglected the vertical navigation problem), or it may be a result of the lack of clear and salient terrain information available to them.

CONCLUSIONS

This accident highlights several issues that will continue to haunt us as the decade of the glass cockpit continues. (1) FMS displays need to provide the required information in a single integrated format. We cannot rely on pilots to integrate multiple, sometimes dissonant, sets of information in time-critical, high-stress situations. (2) Reprogramming the FMS to accommodate runway changes late in flight remains a high-workload task with considerable potential for error. (3) More focus is needed on training pilots to shift from high levels of automation to lower levels (and on training them to recognize the need to do so under strenuous flight conditions).

REFERENCES

Aeronautica Civil of the Republic of Colombia (1996). Aircraft accident report: Controlled flight into terrain, American Airlines Flight 965, Boeing 757-223, N651AA, near Cali, Colombia, December 20, 1995. Santafe de Bogota, D.C., Colombia: Author.

Endsley, M. R. (1995). Toward a theory of situation awareness in dynamic systems. Human Factors, 37(1), 32-64.

Endsley, M. R., & Kiris, E. O. (1995). The out-of-the-loop performance problem and level of control in automation. Human Factors, 37(2), 381-394.

Jones, D. G., & Endsley, M. R. (1996). Sources of situation awareness errors in aviation. Aviation, Space and Environmental Medicine, 67(6), 507-512.

Muir, B. M. (1994). Trust in automation: Part I. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 37(11), 1905-1922.

Parasuraman, R., Molloy, R., Mouloua, M., & Hilburn, B. (1996). Monitoring of automated systems. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance: Theory and applications (pp. 91-115). Mahwah, NJ: Lawrence Erlbaum Associates.

Parasuraman, R., Molloy, R., & Singh, I. L. (1993). Performance consequences of automation-induced complacency. International Journal of Aviation Psychology, 3(1), 1-23.

Riley, V. (1994). A theory of operator reliance on automation. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 8-14). Hillsdale, NJ: Lawrence Erlbaum Associates.

Sheridan, T. B. (1986). Supervisory control. In G. Salvendy (Ed.), Handbook of human factors. New York: Wiley.

Wickens, C. D. (1992). Engineering psychology and human performance (2nd ed.). New York: Harper Collins.

Wiener, E. L. (1985). Cockpit automation: In need of a philosophy. In Proceedings of the 1985 Behavioral Engineering Conference (pp. 369-375). Warrendale, PA: Society of Automotive Engineers.

Wiener, E. L. (1988). Cockpit automation. In E. L. Wiener & D. C. Nagel (Eds.), Human factors in aviation (pp. 433-461). San Diego: Academic Press.