Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user interface of such systems displays only a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs that do not provide adequate information, or that may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user interface contains the information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems that are not fully automated and whose operational constraints can be posed in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.
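To give the flavor of such an analysis, the following is a minimal sketch of the yellow-light example, not the paper's actual reachability computation. It assumes illustrative dynamics and parameter values (constant speed if the driver continues, constant deceleration a_max if the driver brakes, a fixed yellow-light duration, and an intersection width); the function names and numbers are hypothetical. A state from which neither discrete decision keeps the car safe flags a situation where the displayed information (the yellow light alone) is insufficient for the user to act safely.

```python
# Illustrative sketch of the yellow-light hybrid-system example.
# Dynamics and parameter values are assumptions, not from the paper.

def can_stop(v, d, a_max=3.0):
    """Braking mode: can the car stop before the stop line?
    v: speed (m/s), d: distance to intersection (m), a_max: deceleration (m/s^2)."""
    return v**2 / (2 * a_max) <= d

def can_clear(v, d, width=10.0, t_yellow=4.0):
    """Continue mode: can the car clear the intersection before the light turns red?"""
    return v * t_yellow >= d + width

def safe_actions(v, d):
    """Return the discrete decisions that keep the system in the safe set.
    An empty result marks a state with no safe action available to the driver."""
    actions = []
    if can_stop(v, d):
        actions.append("brake")
    if can_clear(v, d):
        actions.append("continue")
    return actions

if __name__ == "__main__":
    # Sweep a few states; (v=25, d=95) has no safe action under these parameters.
    for v in (10.0, 15.0, 25.0):
        for d in (20.0, 60.0, 95.0):
            result = safe_actions(v, d) or "UNSAFE: no safe action"
            print(f"v={v:4.1f} m/s, d={d:5.1f} m -> {result}")
```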
In complex human-machine systems, successful operations depend on an elaborate set of procedures which are specified by the operational management of the organization. These procedures indicate to the human operator (in this case the pilot) the manner in which operational management intends to have various tasks done. The intent is to provide guidance to the pilots and to ensure a safe, logical, efficient, and predictable (standardized) means of carrying out the objectives of the job. However, procedures can become a hodge-podge, and inconsistent or illogical procedures may lead to noncompliance by operators. Based on a field study with three major airlines, the authors propose a model for procedure development called the "Four P's": philosophy, policies, procedures, and practices. Using this model as a framework, the authors discuss the intricate issue of designing flight-deck procedures, and propose a conceptual approach for designing any set of procedures. The various factors, both external and internal to the cockpit, that must be considered for procedure design are presented. In particular, the paper addresses the development of procedures for automated cockpits, a decade-long and highly controversial issue in commercial aviation. Although this paper is based on airline operations, we assume that the principles discussed here are also applicable to other high-risk supervisory control systems, such as space flight, manufacturing process control, nuclear power production, and military operations.
The majority of the problems pilots encounter when using automated systems center on two factors: (1) the pilot has an incomplete and inadequate model of how the autopilot works; and (2) the displays and flight manuals provided to the pilot are inadequate for the task. The tragic accident of Korean Air Lines Flight 007, a Boeing 747 that deviated from its intended flight path, provides a compelling case study of problems related to pilots' use of automated systems. This paper describes what happened and exposes two types of human-automation interaction problems: (1) the pilots of KAL 007 were not provided with adequate information about the actual behavior of the autopilot and its mode transition logic; and (2) the autopilot onboard KAL 007 did not provide adequate information to the flight crew about its active and armed modes. Both factors, according to the International Civil Aviation Organization (1993) report on the accident, contributed to the aircraft's lethal navigation error.
Two versions of an electronic checklist have been implemented in the Advanced Concepts Flight Simulator (ACFS) at NASA Ames Research Center. The two designs differ in the degree of pilot involvement in conducting the checklists. One version (manual-sensed) requires the crew to manually acknowledge the completion of each checklist item. The other version (automatic-sensed) automatically indicates completed items without requiring pilot acknowledgement. These two designs, along with a paper checklist as a control condition, were evaluated in a line-oriented simulation. Twelve aircrews from one major air carrier flew a routine, four-leg, short-haul trip. This paper presents and discusses the portion of the experiment concerned with measuring the effect of the degree of automation on the crews' performance. It presents evidence for a potential downside of implementing an electronic checklist that is designed to provide fully redundant monitoring of the crew's execution and monitoring of procedures.
This report analyzes the functions, formats, designs, lengths, and usage of normal cockpit checklists, as well as the limitations of the aircraft personnel who must interact with them. The checklist problems discussed in the report are also found in other high-risk industries, such as the marine, nuclear, and chemical-process industries, as well as civilian and military air transportation.
The panel will address the relationships between the human factors (HF) research community in aviation and aircraft and avionics manufacturers, and look at how the two communities could work together more efficiently. The panelists come from both worlds, some of them with a foot in both camps, and all have past experience of interaction, cooperation, or contracts with the other party.
Standard operating procedures (SOPs) are drafted and provided to flight crews to dictate the manner in which tasks are carried out. Failure to conform to SOPs is frequently listed as the cause of violations, incidents, and accidents. However, procedures are often designed piecemeal rather than being based on a sound philosophy of operations and policies that follow from such a philosophy. A framework of philosophy, policies, and procedures is proposed.