Trust, Alienation, and How Far to Go with Automation

Trust
If operators do not trust their sensors and displays, expert advisory systems, or automatic control systems, they will not use them, or will avoid using them if possible. On the other hand, if operators come to place too much trust in such systems, they will let down their guard, become complacent, and be unprepared when the system fails. The question of operator trust in automation is an important current issue in human-machine interface design. It is desirable that operators trust their systems, but it is also desirable that they maintain alertness, situation awareness, and readiness to take over.

Alienation
New forms of human-machine interaction can have a set of broader social effects, which can be discussed under the rubric of alienation.
1. People worry that computers can do some tasks, such as remembering and calculating, much better than they themselves can. Surely, people should not try to compete in this arena.
2. Supervisory control tends to make people remote from the ultimate operations they are supposed to be overseeing — remote in space, desynchronized in time, and interacting with a computer instead of the end product or service itself.
3. People lose the perceptual-motor skills that in many cases gave them their identity. They become "deskilled" and, if ever called upon to use their previously well-honed skills, find that they cannot.
4. Increasingly, people who use computers in supervisory control or in other ways, whether intentionally or not, are denied access to the knowledge needed to understand what is going on inside the computer.
5. Partly as a result of factor 4, the computer becomes mysterious, and the untutored user comes to attribute to the computer more capability, wisdom, or blame than is appropriate.
6. Because computer-based systems are growing more complex, and people are being “elevated” to roles of supervising larger and larger aggregates of hardware and software, the stakes naturally become higher. Where a human error before might have gone unnoticed and been easily corrected, now such an error could precipitate a disaster.
7. The last factor in alienation is similar to the first, but all-encompassing, namely, the fear that a “race” of machines is becoming more powerful than the human race.
These seven factors, and the fears they engender, whether justified or not, must be reckoned with.
Computers must be made not only "human friendly" but also non-alienating with respect to these broader factors. Operators and users must become computer literate at whatever level of sophistication they can deal with.


How Far to Go with Automation
There is no question but that the trend toward supervisory control is changing the role of the human operator, posing fewer requirements on continuous sensory-motor skill and more on planning, monitoring, and supervising the computer. As computers take over more and more of the sensory-motor skill functions, new questions are being raised regarding how the interface should be designed to provide the best cooperation between human and machine. Among these questions are: To what degree should the system be automated? How much “help” from the computer is desirable? What are the points of diminishing returns?
Table 6.1.1 lists ten levels of automation, from 0 to 100% computer control. Obviously, few tasks have achieved 100% computer control, but new technology pushes relentlessly in that direction. It is instructive to consider the various intermediate levels of Table 6.1.1 in terms not only of how capable and reliable the technology is, but of what is desirable for the safety and satisfaction of the human operators and the general public.
Table 6.1.1 Scale of Degrees of Automation
1. The computer offers no assistance; the human must do it all.
2. The computer offers a complete set of action alternatives, and
3. Narrows the selection down to a few, or
4. Suggests one alternative, and
5. Executes that suggestion if the human approves, or
6. Allows the human a restricted time to veto before automatic execution, or
7. Executes automatically, then necessarily informs the human, or
8. Informs the human only if asked, or
9. Informs the human only if it, the computer, decides to.
10. The computer decides everything and acts autonomously, ignoring the human.
The current controversy about how much to automate large commercial transport aircraft is often couched in these terms.
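The scale in Table 6.1.1 can be encoded as a simple enumeration. The sketch below is illustrative only (the level names and the helper function are mine, not from the source); it shows how a system design might branch on the chosen degree of automation, e.g. to decide whether anything can execute without an explicit human approval.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """Hypothetical encoding of the ten-point scale in Table 6.1.1."""
    NO_ASSISTANCE = 1          # the human must do it all
    OFFERS_ALTERNATIVES = 2    # complete set of action alternatives
    NARROWS_ALTERNATIVES = 3   # narrows the selection down to a few
    SUGGESTS_ONE = 4           # suggests one alternative
    EXECUTES_IF_APPROVED = 5   # executes the suggestion if the human approves
    VETO_WINDOW = 6            # restricted time to veto before automatic execution
    EXECUTES_THEN_INFORMS = 7  # executes, then necessarily informs the human
    INFORMS_IF_ASKED = 8       # informs the human only if asked
    INFORMS_IF_IT_DECIDES = 9  # informs only if the computer decides to
    FULL_AUTONOMY = 10         # decides everything, ignoring the human

def human_consent_required(level: AutomationLevel) -> bool:
    """At levels 1-5 nothing executes without explicit human action;
    from level 6 upward the computer can act on its own initiative."""
    return level <= AutomationLevel.EXECUTES_IF_APPROVED
```

Note that the decisive break in the scale falls between levels 5 and 6: below it the human is in the loop for every action, above it the human is at best informed after the fact.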


