advanced vehicle systems; decision making; human-centered automation; human factors; satisficing
Technological advances have made plausible the design of automated systems that share responsibility with a human operator. The decision to use automation to assist or replace a human operator in safety-critical tasks must account not only for the technological capabilities of the sensor and control subsystems, but also for the autonomy, capabilities, and preferences of the human operator. By their nature, such human-centered automation problems have multiple attributes: one attribute reflecting human goals and capabilities, and another reflecting automation goals and capabilities. Although good theories describe portions of human behavior generation, in the absence of a general theory of human interaction with complex systems it is difficult to define and find a unique optimal multiattribute resolution to these competing design requirements. We develop a systematic approach to such problems using a multiattribute decomposition of human and automation goals. This paradigm uses both the satisficing decision principle, which is unique to two-attribute problems, and the domination principle, which is a common manifestation of the optimality principle in multiattribute domains. As applied to human-centered automation in advanced vehicle systems, the decision method identifies performance valuations and compares the safety benefit of a system intervention against the cost to the human operator. Formulating the problem in this way places the burden of proof on the automation system: to invoke an automation action, the projected safety enhancement must be compelling enough to justify the cost to the operator’s autonomy. This effectively integrates human factors considerations into the automation design process from its inception. We illustrate the method by analyzing an automated system that prevents lane departures.
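The decision logic sketched in the abstract, where an intervention is admissible only when its projected safety benefit outweighs its cost to the operator's autonomy, can be illustrated in miniature. This is a hedged sketch only: the function names, the scalar benefit/cost valuations, the `boldness` scaling parameter, and the example numbers are illustrative assumptions, not the paper's exact formulation.

```python
def is_satisficing(safety_benefit: float, operator_cost: float,
                   boldness: float = 1.0) -> bool:
    """Two-attribute satisficing test (illustrative): an automation
    action is admissible only when its projected safety benefit is at
    least the cost to the operator, scaled by a caution parameter."""
    return safety_benefit >= boldness * operator_cost


def choose_intervention(actions):
    """Keep only the satisficing candidate actions; the burden of
    proof rests on the automation, so the default is no intervention.
    Among satisficing actions, a simple stand-in for the domination
    principle picks the one with the greatest net benefit."""
    satisficing = [a for a in actions
                   if is_satisficing(a["benefit"], a["cost"])]
    if not satisficing:
        return None  # no action justifies its cost to the operator
    return max(satisficing, key=lambda a: a["benefit"] - a["cost"])


# Hypothetical lane-departure example: a warning is cheap to the
# operator's autonomy, while an active steering correction is costly.
actions = [
    {"name": "warn",  "benefit": 0.6, "cost": 0.2},
    {"name": "steer", "benefit": 0.7, "cost": 0.9},
]
print(choose_intervention(actions)["name"])  # → warn
```

With these numbers only the warning is satisficing: the steering correction's projected benefit (0.7) does not cover its cost to the operator (0.9), so the automation restricts itself to the less intrusive action.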