IEEE Intelligent Systems, vol. 28, no. 1, Jan.–Feb. 2013, pp. 84–88
Robert R. Hoffman , Florida Institute for Human and Machine Cognition
Matthew Johnson , Florida Institute for Human and Machine Cognition
Jeffrey M. Bradshaw , Florida Institute for Human and Machine Cognition
Al Underbrink , Sentar
This essay focuses on trust in automation within macrocognitive work systems, emphasizing the dynamics of trust. The authors consider several distinct meanings or kinds of trust, and the different modes of operation in which trust dynamics play a role. Their goal is to contribute to the development of a methodology for designing and analyzing collaborative human-centered work systems, a methodology that might promote both trust "calibration" and appropriate reliance. The analysis suggests an ontology for what the authors call "active exploration for trusting" (AET).
Automation, Collaboration, Management, Human factors, Human computer interaction, competence envelopes, trust, macrocognitive work systems, dynamics, active exploration
Robert R. Hoffman, Matthew Johnson, Jeffrey M. Bradshaw, Al Underbrink, "Trust in Automation", IEEE Intelligent Systems, vol.28, no. 1, pp. 84-88, Jan.-Feb. 2013, doi:10.1109/MIS.2013.24
1. Unmanned Systems Roadmap 2011–2036, Office of the Undersecretary of Defense for Acquisition, Technology, and Logistics, US Dept. of Defense, 2011.
2. J.K. Hawley, “Not By Widgets Alone,” Armed Forces J., Feb. 2011.
3. The Role of Autonomy in DoD Systems, task force report, Defense Science Board, US Dept. of Defense, July 2012.
4. T. Sheridan, “Computer Control and Human Alienation,” Technology Rev., vol. 83, 1980, pp. 61–73.
5. R.R. Hoffman et al., “The Dynamics of Trust in Cyberdomains,” IEEE Intelligent Systems, Nov./Dec. 2009, pp. 5–11.
6. E.W. Fitzhugh, R.R. Hoffman, and J.E. Miller, “Active Trust Management,” Trust in Military Teams, N. Stanton, ed., Ashgate, 2011, pp. 197–218.
7. R.C. Mayer, J.H. Davis, and F.D. Schoorman, “An Integrative Model of Organizational Trust,” Academy of Management Rev., vol. 20, no. 3, 1995, pp. 709–734.
8. J.M. Bradshaw et al., “Toward Trustworthy Adjustable Autonomy in KAoS,” Trusting Agents for Trustworthy Electronic Societies, LNAI, R. Falcone, ed., Springer, 2005.
9. C.L. Corritore, B. Kracher, and S. Wiedenbeck, “Online Trust: Concepts, Evolving Themes, a Model,” Int'l J. Human-Computer Studies, vol. 58, 2003, pp. 737–758.
10. J.D. Lee and K.A. See, “Trust in Automation: Designing for Appropriate Reliance,” Human Factors, vol. 46, no. 1, 2004, pp. 50–80.
11. B.M. Muir and N. Moray, “Trust in Automation, Part II: Experimental Studies of Trust and Human Intervention in a Process Control Simulation,” Ergonomics, vol. 39, no. 3, 1996, pp. 429–460.
12. R. Parasuraman and V. Riley, “Humans and Automation: Use, Misuse, Disuse, Abuse,” Human Factors, vol. 39, no. 2, 1997, pp. 230–253.
13. Y. Seong and A.M. Bisantz, “The Impact of Cognitive Feedback on Judgment Performance and Trust with Decision Aids,” Int'l J. Industrial Ergonomics, vol. 38, no. 7, 2008, pp. 608–625.
14. E.J. de Visser et al., “The World Is Not Enough: Trust in Cognitive Agents,” Proc. Human Factors and Ergonomics Soc. 56th Ann. Meeting, Human Factors and Ergonomics Soc., 2011, pp. 263–268.
15. P. Madhavan and D.A. Wiegmann, “Effects of Information Source, Pedigree, and Reliability on Operator Interaction with Decision Support Systems,” Human Factors, vol. 49, no. 5, 2007, pp. 773–785.
16. M.T. Dzindolet et al., “The Role of Trust in Automation Reliance,” Int'l J. Human-Computer Studies, vol. 58, no. 6, 2003, pp. 697–718.
17. T.B. Sheridan and W. Verplank, Human and Computer Control of Undersea Teleoperators, tech. report, Man-Machine Systems Laboratory, Dept. of Mechanical Eng., Mass. Inst. of Technology, 1978.
18. R.R. Hoffman et al., “A Method for Eliciting, Preserving, and Sharing the Knowledge of Forecasters,” Weather and Forecasting, vol. 21, no. 3, 2006, pp. 416–428.
19. D.D. Woods and N.B. Sarter, “Capturing the Dynamics of Attention Control from Individual to Distributed Systems: The Shape of Models to Come,” Theoretical Issues in Ergonomic Science, vol. 11, no. 1, 2010, pp. 7–28.
20. G.A. Klein and R.R. Hoffman, “Seeing the Invisible: Perceptual-Cognitive Aspects of Expertise,” Cognitive Science Foundations of Instruction, M. Rabinowitz ed., 1992, pp. 203–226.
21. P. Koopman and R.R. Hoffman, “Work-Arounds, Make-Work, and Kludges,” IEEE Intelligent Systems, Nov./Dec. 2003, pp. 70–75.
22. D.D. Woods, “Reflections on 30 Years of Picking Up the Pieces After Explosions of Technology,” AFRL Autonomy Workshop, US Air Force Research Laboratory, Sept. 2011.
23. E.M. Roth, “Facilitating 'Calibrated' Trust in Technology of Dynamically Changing 'Trust-Worthiness,'” Working Meeting on Trust in Cyberdomains, Inst. for Human and Machine Cognition, 2009.
24. S.T. Mueller and G.A. Klein, “Improving Users' Mental Models of Intelligent Software Tools,” IEEE Intelligent Systems, Mar./Apr. 2011, pp. 77–83.