# Guest Editors' Introduction: Human-Centered Computing at NASA

Michael G. Shafto, NASA Ames Research Center
Robert R. Hoffman, Institute for Human & Machine Cognition

Pages: pp. 10-14

Human-centered computing, also called "human-centered systems,"

• Focuses on creating new computational devices
• Is often contrasted with the traditional approach in computer science that might be dubbed "machine-centered computing" 1

That statement also accurately and succinctly describes NASA's working definition of HCC. In the last issue of Intelligent Systems, Robert Hoffman and his colleagues presented a framework encompassing the various approaches that constitute the world of HCC. 1 This special issue examines the particular region NASA is exploring within this vast world—the working definition of HCC shaped by NASA's mission requirements, 2 available resources, and existing investments.

## HCC AND HUMAN-FACTORS ENGINEERING

HCC is influenced by emerging views within human-factors engineering and psychology. Historically, NASA efforts have entailed a great deal of human-factors research on topics including vision, 3 fatigue and circadian factors, 4 attention, 5 cognitive modeling, 6 display design and evaluation, 7 virtual environments, 8 workload models and metrics, 9 design of procedures, 10 human-automation interaction, 11,12 decision making, 13 and group performance. 14,15 In recent years, new, important research topics have emerged, such as how to make automated systems "team players." 16 These topics have mandated an interdisciplinary approach and, more recently, have led to a focus on human-centered design. Both the approach and the focus have changed the landscape of human-factors engineering.

David Woods recommends more proactive involvement of human-factors engineers in the design of new systems, along with increased use of field research methods and strong integration of research with design and testing. 17 His recommendations aim to increase the impact of human-centered design considerations, thereby improving overall system performance and reducing the likelihood of disastrous failures. HCC, in addition to extending our knowledge base about human behavior, focuses on creating new computational devices that amplify and extend human capabilities. In HCC, as in computer science generally, design, capability, and understanding are inextricably joined.

While embracing Woods' vision about human-factors engineers' role in design, HCC practitioners must keep in mind the research side of the research-design merger. HCC aims to deliver real systems, but enthusiasm for one's own design concepts, or pressure from implementation schedules, sometimes drives out research-oriented concerns. For example, introducing new technologies into complex sociotechnical workplaces invariably changes the nature of the cognitive work of both individuals and teams. In the usual procurement process, after providing a deliverable to the sponsor, the system developers go away. HCC methodology mandates a second wave of empirical effort to ensure usability and usefulness in the envisioned workplace. Unless that happens, HCC might no longer be so clearly "contrasted with the traditional approach in computer science that might be dubbed 'machine-centered computing.'"

## HCC AND COMPUTER SCIENCE

Advances in computer science theory and practice have also uniquely shaped HCC. A core research challenge of HCC is to extend the scope of computer science and software engineering to include distributed systems of human agents and software agents. In particular, design and analysis methods must be extended to cope with the flexibility and adaptability required by present and future NASA missions. Computer science advances 18-20 have been confined mainly to computational systems' internal workings. Many difficult challenges that HCC poses deal with interactions between a computational system and its external environment.

In HCC's concern with these external interactions, it is related to the software engineering subdiscipline known as requirements modeling. 21 By drawing on a knowledge base from the behavioral, cognitive, and social sciences 22 and integrating this knowledge with computational modeling systems, 23 HCC places primary emphasis on exploring and questioning requirements. The refusal to oversimplify requirements engineering is mainly what gives HCC the look and feel of a research activity, despite its firm commitment to the design and implementation of actual systems.

When we build new systems for complex sociotechnical workplaces, we are actually building new tasks and jobs. This might incur grave risks. 24,25 Gaps in our understanding of the real requirements have serious practical impacts, such as

• Weak definitions of mission design alternatives
• Vague and unrealistic assumptions about operations concepts
• Unwillingness to change legacy systems because of unknown impacts
• Fragilities and user-hostile features that force local kludges and work-arounds

As an emerging paradigm envisioned by Woods, Hoffman and his colleagues, and many others, including NASA researchers, HCC is explicitly oriented toward developing a methodology that avoids these risks.

## HCC AND NASA

A mission system consists of the people, facilities, procedures, software, and other technologies required to conduct a NASA mission. Mission systems design must balance the risks of innovation against the costs of conservatism. NASA's traditional approach is heavily weighted toward continuity of current-generation operations with those of prior missions. HCC aims to advance key technologies that will enable NASA to envision and evaluate a broader range of new mission system designs. The goal is to dramatically increase the range of visionary mission options.

NASA's HCC research endeavors to improve the design of mission systems for science and exploration. HCC's practical impacts will span launch and range operations, vehicle processing, and Space Transportation System and International Space Station orbital and ground operations, as well as surface, orbital, and remote planetary exploration operations. HCC is also concerned with advanced air-ground integration concepts in civil aviation.

The four articles in this special issue illustrate four aspects of NASA's HCC research program: sociotechnical systems, multimodal interfaces, human-system modeling, and decision systems.

### Sociotechnical systems

To achieve its technical ambitions, HCC must deploy the combined strengths of behavioral, social, and computer science. Behavioral and social sciences contribute essential methods for analyzing teams and organizations. Computer science contributes powerful theoretical and modeling approaches for describing dynamic hybrid systems. Combining all these methods lets us analyze existing sociotechnical systems and design new ones.

Sociotechnical systems are obviously more difficult to understand than are their software components alone. We find some of the same challenges as in conventional software engineering—interactions among schedule pressure, uncertainty, and dynamically changing situations and priorities. However, we also find other challenges that arise from interactions among software, teams, and the risky environments of aerospace operations.

In "Ethnography, Customers, and Negotiated Interactions at the Airport," Roxana Wales, John O'Neill, and Zara Mirmalek discuss the combined use of behavioral, social, and computational methods to analyze an existing sociotechnical system—the commercial airline passenger system. They use ethnographic methods to build systematic, though informal, models of internal work processes, decision making, software design, and resource allocation as factors in airline passenger delays. They work at the boundaries of these informal models to identify external interactions across the system, exposing additional systematic influences that were invisible under the organizational model of routine air travel, with delays as an exception. They provide several practical suggestions for system-level improvements. "Negotiated interactions"—once described—present opportunities to develop new technology and services. Developing such support systems requires a new model of the airline passenger system and new kinds of user interfaces and intelligent systems.

This work demonstrates a core paradox of HCC, driven by the intimate connection of research methods and design goals: Meticulous description of work practice yields ideas for radical design changes. What seems at first an obsession with the status quo turns into a recipe for transformation.

### Multimodal interfaces

Within sociotechnical systems, multimodal interfaces play a central role in acquiring, transmitting, and rendering information in visual, speech, nonspeech audio, and tactile formats. Methods and formal models for designing suites of such interfaces are in their infancy. NASA is exploring a wide range of interface concepts, including concept maps 26 and speech recognition. 27

Multimodal interfaces must enable decision makers to integrate and interpret high-bandwidth, heterogeneous data for biomedical support, remote science, predictive control, intelligent-vehicle health maintenance, onboard-system management, and procedure execution.

In "Designing Human-Centered Distributed Information Systems," Jiajie Zhang, Vimla Patel, Kathy Johnson, Jane Malin, and Jack Smith consider information management and interface design in a distributed biomedical support system. They move beyond conventional human-computer interaction to a broad consideration of ontologies for interface programming. Interfaces must be described in terms of

• Levels of abstraction and flexible movement among levels
• Multiple representations and expert coordination of multiple representations
• Fast, error-free communication in safety-critical situations, including predictive and preparatory aspects of communication

Zhang and his colleagues' perspective and methods extend our understanding of interface-programming methods from relatively static GUIs to team interfaces that reflect mixed-initiative models of content, activity, and dialog.

Mission-critical displays must fit the cognitive characteristics of decision makers to ensure the reliability of time-pressured decision systems. Conventional display design focuses only on the level of representations, which are usually relatively independent of tasks, users, and functions. HCC focuses attention not just on the level of representations but also on the levels of roles, users, and tasks. This multilevel approach is important for designing information systems that support multiple types of users. Each particular user interface must be analyzed in relation to a larger system, as well as in relation to a set of tasks and procedures.

### Human-system modeling

Understanding the requirements for specific interface designs necessitates the development of human-system models that enable analysis and simulation of key mission activities. A main technical challenge here is to create approximate models of human performance that can be integrated with other system engineering models. 28 As in any type of modeling, model development must be cost effective, 29 and models must be usable at variable and controllable abstraction levels. 30

Systems' life-cycle costs are determined largely by early decisions. Model-based design methods must envision and quantify operational scenarios during these early design phases. Life-cycle costs and risks can be mitigated when early system models include human behavior. Without credible human behavior models, it is too easy to push aside safety, performance, and integration issues with the claim that they will be discovered and dealt with during "training."

Human-system modeling has advanced significantly during the past two decades, owing largely to computer science advances. The software engineering of modeling frameworks is vastly better today than it was in the early 1980s. As a result, human-system modeling frameworks can now "talk to" other modeling frameworks used during systems design. The objective is to improve early design by letting designers use cost-effective models of human behavior, procedural tasks, human-automation interaction, time budgets for individual agents, and consistency checks on synchronous and asynchronous (for example, document-mediated) communication.
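The kind of early-design check described above can be illustrated with a minimal sketch. All agents, activity names, durations, and helper functions here are hypothetical illustrations, not part of any NASA framework: the sketch shows a time budget for an individual agent and a consistency check on message-mediated communication between two agents.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Activity:
    """One timed step in an agent's procedure; message is set for send/recv steps."""
    name: str
    duration: float                # minutes (assumed unit)
    message: Optional[str] = None

def total_time(activities):
    """Time budget for one agent: the sum of its activity durations."""
    return sum(a.duration for a in activities)

def unmatched_messages(sender, receiver):
    """Consistency check: every message sent must be received, and vice versa."""
    sent = {a.message for a in sender if a.name == "send"}
    received = {a.message for a in receiver if a.name == "recv"}
    return sent ^ received         # empty set means communication is consistent

# Hypothetical two-agent scenario: a crew member and an automated monitor.
crew = [
    Activity("check_display", 2.0),
    Activity("send", 0.5, message="anomaly_report"),
    Activity("procedure_step", 5.0),
]
monitor = [
    Activity("recv", 0.5, message="anomaly_report"),
    Activity("diagnose", 3.0),
]

assert total_time(crew) <= 10.0               # crew fits its time budget
assert not unmatched_messages(crew, monitor)  # no dropped or phantom messages
```

A real framework such as Brahms operates at far greater fidelity, but even a toy model of this sort makes overrun time budgets and dropped messages visible during early design, before implementation begins.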

In "Modeling and Simulating Work Practice: A Method for Work Systems Design," Maarten Sierhuis and William Clancey examine the capabilities and limitations of current human-system modeling frameworks in the context of NASA mission design. They focus on developing human-system models to apply in early design phases. Their research concentrates on understanding how people and systems are interconnected in practice. To accomplish this, they examine work systems analysis and evaluation and create computational models for simulating how people interact with automated systems. These models, which they created using the Brahms software, 31 capture the activities of teams of experts (hence the allusion to symphonic music) and the behavior of advanced software systems. In both cases, important factors elude our current understanding: Interactions of cognitive, physical, and social factors are not well modeled. Also, important relations among interface structure, display content, and dynamic behavior are not yet well modeled.

The commitment to HCC as a software engineering methodology implies that HCC should be integrated with other advanced software engineering methodologies. Unlike many earlier human-modeling frameworks, Brahms has a sound semantic foundation, essentially compatible with process-algebraic semantics. 20

### Decision systems

At each NASA mission system's core are one or more decision systems that monitor life support systems, acquire and manage science and engineering data, control trajectories, and generally enforce performance and safety boundaries. Decision systems comprise teams of experts and machines, communicating over a network of multimodal interfaces. The cost-effective design and implementation of such systems presents a serious challenge to NASA and to HCC.

In "Intelligent Control of Life Support for Space Missions," Debra Schreckenghost, Carroll Thronesbery, Peter Bonasso, David Kortenkamp, and Cheryl Martin describe how they built a decision system for advanced life support. They highlight the challenges of designing intelligent systems for flexible responses to unpredictable events. The nature of current and planned NASA missions precludes detailed planning and rehearsal of every activity. Communication delays and reliance on intelligent aiding systems require shared frameworks for communication and decision making among crew and software agents. Practical success here will be demonstrated by prototype systems that avoid the recurrent failures that Woods 17 and others have documented.

Schreckenghost and her colleagues test the potential of agent-based tools and multimodal interface technologies to enhance the design, development, and deployment of high-performance human-automation systems in a realistic NASA mission environment. Indeed, NASA conducts all system-level tests of HCC decision systems with the research and mission personnel responsible for mission design and operations. This tight integration of research and operations is a defining characteristic of the HCC paradigm. This has two immediate benefits: Research and development challenges are calibrated against the ground truth of real mission operations, and early products of research and development become available for implementation as rapidly as possible. The goal is to design, prototype, and evaluate more-capable automated mission systems for ground and remote operations. A key challenge that Schreckenghost and her colleagues address is the modeling and representation of the vast knowledge required in NASA mission operations. Success in this effort will also benefit other projects at the Johnson Space Center, the Kennedy Space Center, and the Jet Propulsion Laboratory.

## CONCLUSION

NASA HCC is responding to the challenges that Woods laid out 17 by merging theory and models from computer science with knowledge and methodology from behavioral and social science. This conceptual and methodological integration, which Herbert Simon largely anticipated, 32 is then put into practice via a "forward deployment" strategy that places highly trained researchers directly on the front lines of mission design and implementation. The result is a powerful, agile approach to the design of sociotechnical systems. This approach does not wait for organizational change—it causes change.

## References

• 1. R.R. Hoffman et al., "A Rose by Any Other Name ... Would Probably Be Given an Acronym," IEEE Intelligent Systems, vol. 17, no. 4, July/Aug. 2002, pp. 72-80.
• 2. Intelligent Systems Program home page, NASA Ames Research Center, Moffett Field, Calif., 2002, http://is.arc.nasa.gov.
• 3. A.B. Watson and A.J. Ahumada, "Model of Human Visual-Motion Sensing," J. Optical Soc. America, vol. 2, no. 2, 1985, pp. 322-341.
• 4. M.M. Mitler et al., "Catastrophes, Sleep, and Public Policy: Consensus Report of a Committee for the Association of Professional Sleep Societies," Sleep, vol. 11, 1988, pp. 100-109.
• 5. C.L. Folk and R.W. Remington, "When Knowledge Doesn't Help: Limitations on the Flexibility of Attentional Control," Converging Operations in the Study of Visual Selective Attention, A. Kramer, M. Coles, and G. Logan, eds., Am. Psychological Assoc., Washington, D.C., 1996.
• 6. M.A. Freed, "Using the RAP System to Simulate Human Error," Proc. 1996 AAAI Fall Symp. Plan Execution: Problems and Issues, AAAI Press, Menlo Park, Calif., 1996, pp. 52-58.
• 7. M.J. Burns, D.L. Warren, and M. Rudisill, "Formatting Space-Related Displays to Optimize Expert and Non-expert User Performance," Proc. CHI 86 Conf. Human Factors in Computing Systems, ACM Press, New York, 1986.
• 8. S.R. Ellis, M.K. Kaiser, and A.J. Grunwald, eds., Pictorial Communication in Virtual and Real Environments, Taylor and Francis, London, 1991.
• 9. S.G. Hart and C.D. Wickens, "Workload Assessment and Prediction," MANPRINT: An Approach to Systems Integration, H.R. Booher, ed., Van Nostrand Reinhold, New York, 1990, pp. 257-296.
• 10. A. Degani, and E.L. Wiener, "Procedures in Complex Systems: The Airline Cockpit," IEEE Trans. Systems, Man, and Cybernetics, vol. 27, no. 3, 1997, pp. 302-312.
• 11. C.E. Billings, Aviation Automation: The Search for a Human-Centered Approach, Lawrence Erlbaum Associates, Mahwah, N.J., 1997.
• 12. A. Degani, M. Shafto, and A. Kirlik, "Modes in Human-Machine Systems: Review, Classification, and Application," Int'l J. Aviation Psychology, vol. 9, no. 2, 1999, pp. 125-138.
• 13. G.A. Klein et al., eds., Decision Making in Action: Models and Methods, Ablex, Norwood, N.J., 1993.
• 14. H.C. Foushee and R.L. Helmreich, "Group Interaction and Flight Crew Performance," Human Factors in Aviation, E.L. Wiener and D.C. Nagel, eds., Academic Press, San Diego, Calif., 1988.
• 15. B.G. Kanki and H.C. Foushee, "Communication as a Group Process Mediator of Aircrew Performance," Aviation, Space, and Environmental Medicine, vol. 60, no. 5, 1989, pp. 402-410.
• 16. J.T. Malin et al., Making Intelligent Systems Team Players: Case Studies and Design Issues, NASA TM 104738, NASA Johnson Space Center, Houston, Texas, 1991.
• 17. D.D. Woods, Watching Human Factors Watch People at Work (Quicktime video of Presidential Address to the Human Factors and Ergonomics Soc.), 28 Sept. 1999, http://csel.eng.ohio-state.edu/hf99.
• 18. G. Lowe, "Probabilistic and Prioritized Models of Timed CSP," Theoretical Computer Science, vol. 138, no. 2, 1995, pp. 315-352.
• 19. R. Milner, Communicating and Mobile Systems: The π-Calculus, Cambridge Univ. Press, Cambridge, UK, 1999.
• 20. A.W. Roscoe, Theory and Practice of Concurrency, Prentice-Hall, Upper Saddle River, N.J., 1997.
• 21. J. Mylopoulos, L. Chung, and E. Yu, "From Object-Oriented to Goal-Oriented Requirements Analysis," Comm. ACM, vol. 42, no. 1, Jan. 1999, pp. 31-37.
• 22. J.M. Orasanu and M.G. Shafto, "Designing for Cognitive Task Performance," Handbook of Systems Engineering and Management, A. Sage and W. Rouse, eds., John Wiley & Sons, New York, 1999.
• 23. W.J. Clancey, "Simulating Activities: Relating Motives, Deliberation, and Attentive Coordination," to be published in Cognitive Systems Rev.; http://home.att.net/~WJClancey/SimulatingActivities.pdf.
• 24. C.A.R. Hoare, "The Emperor's Old Clothes" (Turing award address), Comm. ACM, vol. 24, no. 2, Feb. 1981, pp. 75-83.
• 25. N.G. Leveson, Safeware: System Safety and Computers, Addison-Wesley, Boston, 1995.
• 26. A. Cañas, D.B. Leake, and A. Maguitman, "Combining Concept Mapping with CBR: Towards Experience-Based Support for Knowledge Modeling," Proc. 14th Int'l Florida Artificial Intelligence Research Soc. Conf., AAAI Press, Menlo Park, Calif., 2001, pp. 286-290.
• 27. H. Namarvar, J.S. Liaw, and T.W. Berger, "A New Dynamic Synapse Neural Network for Speech Recognition," Proc. IEEE Int'l Joint Conf. Neural Networks, vol. 4, IEEE CS Press, Los Alamitos, Calif., 2001, pp. 2985-2990.
• 28. M. Sierhuis, Modeling and Simulating Work Practice. Brahms: A Multiagent Modeling and Simulation Language for Work System Analysis and Design, SIKS Dissertation Series, no. 2001-10, Dept. of Social Science and Informatics, Univ. of Amsterdam, Amsterdam, 2001.
• 29. B.E. John et al., "Automating CPM-GOMS," Proc. CHI 02: Conf. Human Factors in Computing Systems, ACM Press, New York, 2002.
• 30. N.G. Leveson, Papers and Reports on Accidents and Accident Models, Massachusetts Inst. of Technology, Cambridge, Mass., 2002, http://sunnyday.mit.edu/accidents/index.html.
• 31. W.J. Clancey et al., "Brahms: Simulating Practice for Work Systems Design," Int'l J. Human-Computer Studies, vol. 49, 1998, pp. 831-865.
• 32. H.A. Simon, The Sciences of the Artificial, MIT Press, Cambridge, Mass., 1969.