Human Error

Human error has long been of interest, but only in recent decades has there been a serious effort to understand human error in terms of categories, causation, and remedy. There are several ways to classify human errors. One is according to whether it is an error of omission (something not done which was supposed to have been done) or commission (something done which was not supposed to have been done). Another is slip (a correct intention for some reason not fulfilled) vs. mistake (an incorrect intention which was fulfilled). Errors may also be classified according to whether they occur in sensing, perceiving, remembering, deciding, or acting.

There are some special categories of error worth noting which are associated with following procedures in the operation of systems. One, for example, is called a capture error, wherein the operator, being very accustomed to a series of steps, say, A, B, C, and D, intends at another time to perform E, B, C, F. But he is "captured" by the familiar sequence B, C and does E, B, C, D instead.
As to effective therapies for human error, proper design to make operation easy, natural, and unambiguous is surely the most important. If possible, the system design should allow for error correction before the consequences become serious. Active warnings and alarms are necessary when the system can detect incipient failures in time to take such corrective action. Training is probably next most important after design, but no amount of training can compensate for an error-prone design. Preventing exposure to error by guards, locks, or an additional "execute" step can help make sure that the most critical actions are not taken without sufficient forethought. Least effective are written warnings such as posted decals or warning statements in instruction manuals, although many tort lawyers would like us to believe the opposite.


Mental Workload

Given the complexity of modern human-machine systems, it is imperative to know whether the mental workload of the operator is too great for safety. Human-machine systems engineers have sought to develop measures of mental workload, the idea being that as mental load increases the risk of error increases, but presumably measurable mental load comes before an actual lapse into error.
Three approaches have been developed for measuring mental workload:
1. The first and most used is the subjective rating scale, typically a ten-level category scale with descriptors for each category from no load to unbearable load.
2. The second approach is use of physiological indexes which correlate with subjective scales, including heart rate and the variability of heart rate, certain changes in the frequency spectrum of the voice, electrical resistance of the skin, diameter of the pupil of the eye, and certain changes in the evoked brain wave response to sudden sound or light stimuli.
3. The third approach is to use what is called a secondary task: an easily measurable additional task which consumes all of the operator's attention remaining after the requirements of the primary task are satisfied. This technique has been used successfully in the laboratory, but has shortcomings in practice in that operators may refuse to cooperate; a rough sketch of the idea follows this list.
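As a loose illustration of the secondary-task idea (a sketch only; the scoring formula and the numbers are assumptions, not a method from the text), spare attentional capacity can be estimated by comparing performance on the secondary task alone against performance on it while the primary task is also being done:

    def spare_capacity_index(secondary_alone: float, secondary_dual: float) -> float:
        """Estimate spare attentional capacity from secondary-task scores.

        secondary_alone: performance on the secondary task by itself
                         (e.g., correct responses per minute).
        secondary_dual:  performance on the same task while also doing
                         the primary task.
        Returns an index in [0, 1]: 1.0 means the primary task left
        attention fully free; 0.0 means it consumed all of it.
        """
        if secondary_alone <= 0:
            raise ValueError("baseline performance must be positive")
        return max(0.0, min(1.0, secondary_dual / secondary_alone))

    # Example: 40 responses/min alone vs. 10 responses/min under dual-task
    # conditions suggests about 25% spare capacity, i.e., a heavily
    # loaded operator.
    print(spare_capacity_index(40, 10))  # 0.25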
Such techniques are now routinely applied to critical tasks such as aircraft landing, air traffic control, certain planned tasks for astronauts, and emergency procedures in nuclear power plants. The evidence suggests that supervisory control relieves mental load when things are going normally, but when automation fails the human operator is subjected rapidly to increased mental load.


Human Workload and Human Error

As noted above, new technology allows combination, integration, and simplification of displays compared to the intolerable plethora of separate instruments in older aircraft cockpits and plant control rooms. The computer has taken over more and more functions from the human operator. Potentially these changes make the operator's task easier. However, they also allow much more information to be presented, more extensive advice to be given, and so on.
These advances have elevated the stature of the human operator from providing both physical energy and control, to providing only continuous control, to finally being a supervisor of a robotic vehicle or system. Expert systems can now answer the operator's questions, much as a human consultant does, or whisper suggestions in his ear even if he does not request them. These changes seem to add many cognitive functions that were not present at an earlier time. They make the operator into a monitor of the automation, who is supposed to step in when required and set things straight. Unfortunately, people are not always reliable monitors and interveners.

Common Criteria for Human Interface Design

Design of operator control stations for teleoperators poses the same types of problems as design of controls and displays for aircraft, highway vehicles, and trains. The displays must show the important variables unambiguously to whatever accuracy is required, but beyond that must show the variables in relation to one another so as to clearly portray the current "situation" (situation awareness is currently a popular topic in the study of human operators of complex systems). Alarms must get the operator's attention; indicate by text, symbol, or location on a graphic display what is abnormal, where in the system the failure occurred, and what the urgency is; and, if response is urgent, even suggest what action to take. (For example, the ground-proximity warning in an aircraft gives a loud "Whoop, whoop!" followed by a distinct spoken command "Pull up, pull up!") Controls, whether analogic (joysticks, master arms, knobs) or symbolic (special-purpose buttons, general-purpose keyboards), must be natural and easy to use, and require little memory of special procedures (computer icons and windows do well here). The placement of controls and instruments and their mode and direction of operation must correspond to the desired direction and magnitude of system response.

High-Speed Train Control

With respect to new electronic technology for information sensing, storage, and processing, railroad technology has lagged behind that of aircraft and highway vehicles, but currently is catching up. The role of the human operator in future rail systems is being debated, since for some limited right-of-way trains (e.g., in airports) one can argue that fully automatic control systems now perform safely and efficiently. The train driver’s principal job is speed control (though there are many other monitoring duties he must perform), and in a train this task is much more difficult than in an automobile because of the huge inertia of the train — it takes 2 to 3 km to stop a high-speed train. Speed limits are fixed at reduced levels for curves, bridges, grade crossings, and densely populated areas, while wayside signals temporarily command lower speeds if there is maintenance being performed on the track, if there are poor environmental conditions such as rock slides or deep snow, or especially if there is another train ahead. The driver must obey all speed limits and get to the next station on time. Learning to maneuver the train with its long time constants can take months, given that for the speed control task the driver’s only input currently is an indication of current speed.
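To get a feel for the magnitudes involved, a rough back-of-the-envelope check helps (the 300 km/h cruise speed is an assumed figure for illustration; the text gives only the 2 to 3 km stopping distance):

    # Deceleration implied by stopping a high-speed train in 2-3 km,
    # assuming constant braking from an assumed 300 km/h cruise speed.
    v = 300 / 3.6            # speed in m/s (about 83 m/s)
    for d in (2000, 3000):   # stopping distances from the text, in meters
        a = v**2 / (2 * d)   # required deceleration, from v^2 = 2*a*d
        t = v / a            # time to come to rest
        print(f"stop in {d} m: a = {a:.2f} m/s^2 over {t:.0f} s")
    # Roughly 1.2-1.7 m/s^2 sustained for about a minute -- well beyond
    # what a driver can judge by eye, hence the need for anticipation.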
The author’s laboratory has proposed a new computer-based display which helps the driver anticipate the future effects of current throttle and brake actions. This approach, based on a dynamic model of the train, gives an instantaneous prediction of future train position and speed based on current acceleration, so that speed can be plotted on the display assuming the operator holds to current brake-throttle settings.
It also plots trajectories for maximum emergency braking and maximum service braking. In addition, the computer generates a speed trajectory which adheres to all (known) future speed limits, gets to the next station on time, and minimizes fuel/energy consumption.
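A minimal sketch of the prediction idea, assuming a point-mass model with constant acceleration between samples (the braking rates and speeds below are invented for illustration; the actual display is driven by a full dynamic model of the train):

    def predict_trajectory(x0, v0, a, dt=1.0, horizon=120.0):
        """Integrate position (m) and speed (m/s) forward assuming the
        acceleration a (m/s^2) implied by the current brake-throttle
        setting is held. Returns (time, position, speed) samples,
        stopping early if the train comes to rest."""
        samples, t, x, v = [], 0.0, x0, v0
        while t < horizon and v > 0:
            samples.append((t, x, v))
            x += v * dt + 0.5 * a * dt * dt
            v = max(0.0, v + a * dt)
            t += dt
        samples.append((t, x, v))
        return samples

    v0 = 250 / 3.6  # current speed (assumed), m/s
    coast     = predict_trajectory(0.0, v0, a=-0.1)  # current setting (assumed)
    service   = predict_trajectory(0.0, v0, a=-0.8)  # max service braking (assumed)
    emergency = predict_trajectory(0.0, v0, a=-1.5)  # max emergency braking (assumed)
    # Plotting speed against predicted position for these three curves
    # gives the driver the anticipatory picture described above.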


Advanced Traffic Management Systems

Automobile congestion in major cities has become unacceptable, and advanced traffic management systems are being built in many of these cities to measure traffic flow at intersections (by some combination of magnetic loop detectors, optical sensors, and other means) and to regulate stoplights and message signs. These systems can also issue advisories of accidents ahead by means of variable message signs or radio, and advise on alternative routings. In emergencies they can dispatch fire, police, ambulances, or tow trucks, and in the case of tunnels can shut down entering traffic completely if necessary. These systems are operated by a combination of computers and humans from centralized control rooms. The operators look at banks of video monitors which let them see the traffic flow at different locations, and at computer-graphic displays of maps, alarm windows, and textual messages. The operators get advice from computer-based expert systems, which suggest the best responses based on measured inputs, and must decide whether to accept the computer's advice, whether to seek further information, and how to respond.
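As a loose illustration of that advisory loop (a toy rule set; the thresholds, rules, and message texts are invented for this sketch, not taken from any deployed system):

    def advise(occupancy: float, avg_speed_kmh: float, incident_reported: bool) -> list:
        """Suggest candidate responses from simple measured inputs; the
        human operator decides whether to accept them, seek more
        information, or respond differently."""
        suggestions = []
        if incident_reported:
            suggestions.append("dispatch tow truck / police to reported location")
            suggestions.append("post incident advisory on upstream message signs")
        if occupancy > 0.35 and avg_speed_kmh < 20:
            suggestions.append("advise alternative routing via radio and signs")
            suggestions.append("retime stoplights on parallel arterials")
        return suggestions or ["no action suggested"]

    print(advise(occupancy=0.42, avg_speed_kmh=12, incident_reported=True))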

Smart Cruise Control

Standard cruise control has a major deficiency in that it knows nothing about vehicles ahead, and one can easily collide with the rear end of another vehicle if not careful. In a smart cruise control system a microwave or optical radar detects the presence of a vehicle ahead and measures that distance. But there is a question of what to do with this information. Just warn the driver with some visual or auditory alarm (auditory is better, because the driver does not have to be looking in the right place)? Can a warning come too late to elicit braking, or surprise the driver so that he brakes too suddenly and causes a rear-end accident to his own vehicle? Should the computer automatically apply the brakes by some function of distance to the obstacle ahead, speed, and closing deceleration? If the computer did all the braking, would the driver become complacent and not pay attention, to the point where a serious accident would occur if the radar failed to detect an obstacle, say, a pedestrian or bicycle, or the computer failed to brake? Should braking be some combination of human and computer braking, and if so, by what algorithm? These are human factors questions which are currently being researched.
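One illustrative candidate, offered purely as a sketch (the time-to-collision logic and all thresholds are assumptions, not a description of any production system), is to warn the driver first and then blend in automatic braking as the time to collision shrinks:

    def cruise_response(gap_m: float, closing_speed_m_s: float):
        """Map range and closing rate to (warn, brake_fraction), where
        brake_fraction is the share of maximum service braking applied
        automatically. Thresholds are illustrative assumptions."""
        if closing_speed_m_s <= 0:           # pulling away: nothing to do
            return (False, 0.0)
        ttc = gap_m / closing_speed_m_s      # time to collision, seconds
        if ttc > 6.0:                        # ample margin: stay quiet
            return (False, 0.0)
        if ttc > 3.0:                        # warn, leave braking to the human
            return (True, 0.0)
        # Below 3 s, ramp automatic braking up as ttc falls toward 1 s.
        return (True, min(1.0, (3.0 - ttc) / 2.0))

    print(cruise_response(gap_m=40.0, closing_speed_m_s=10.0))  # ttc 4 s: warn only
    print(cruise_response(gap_m=15.0, closing_speed_m_s=10.0))  # ttc 1.5 s: warn + brake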
It is interesting to note that current developmental systems only decelerate and downshift, mostly because if vehicle manufacturers sold vehicles which claimed to perform braking, they would be open to a new and worrisome area of litigation.
The same radar technology that can warn the driver or help control the vehicle can also be applied to cars overtaking from one side or the other. Another set of questions then arises as to how and what to communicate to the driver and whether or not to trigger some automatic control maneuver in certain cases.
