Improving the Reliability of Artificial Intelligence Planning Systems by Analyzing their Failure Recovery
February 1995 (vol. 7 no. 1)
pp. 14-25

Abstract—As planning technology improves, Artificial Intelligence planners are being embedded in increasingly complicated environments: ones that are challenging even for human experts. Consequently, failure is becoming both more likely for these systems (due to the difficult and dynamic nature of the new environments) and more important to address (due to the systems’ potential use in real-world applications). This paper describes the development of a failure recovery component for a planner in a complex simulated environment and a procedure, called Failure Recovery Analysis, for assisting programmers in debugging that planner. The failure recovery design is iteratively enhanced and evaluated in a series of experiments. Failure Recovery Analysis is described and demonstrated on an example from the Phoenix planner. The primary advantage of these approaches over existing ones is that they require only a weak model of the planner and its environment, which makes them most suitable while the planner is still under development. When integrated, failure recovery and Failure Recovery Analysis improve the planner’s reliability by repairing failures during execution and by identifying failures caused by bugs in the planner and in failure recovery itself.
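The execute-repair-analyze cycle the abstract describes can be sketched in a few lines. This is an illustrative reconstruction, not the Phoenix planner's actual implementation: all names (`execute_plan`, `analyze_failures`, the failure-type strings) are hypothetical. Failures during execution are repaired by trying recovery methods in turn, every (failure, recovery) pair is logged, and an offline analysis pass flags failure types that recur often enough to suggest a bug in the planner or in recovery itself.

```python
from collections import Counter

def execute_plan(actions, recovery_methods, log):
    """Run each action; on failure, try recovery methods in order.

    Each action returns None on success or a failure-type string on
    failure. Each recovery method returns True if its repair worked.
    Every attempted (failure, method) pair is appended to `log` so a
    later analysis pass can inspect the recovery history.
    """
    for action in actions:
        failure = action()
        if failure is None:
            continue
        for recover in recovery_methods:
            log.append((failure, recover.__name__))
            if recover(failure):
                break                # repaired; move to the next action
        else:
            return False             # no method repaired this failure
    return True

def analyze_failures(log, threshold=2):
    """A crude stand-in for Failure Recovery Analysis: report failure
    types that recur at least `threshold` times, as candidate bugs."""
    counts = Counter(failure for failure, _ in log)
    return [f for f, n in counts.items() if n >= threshold]
```

For example, a plan whose actions repeatedly raise the same (hypothetical) "projection-error" failure would be repaired at run time by the recovery loop, while `analyze_failures` would surface that failure type to the programmer for debugging.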

[1] J.A. Ambros-Ingerson and S. Steel, “Integrating planning, execution and monitoring,” Proceedings of the Seventh National Conference on Artificial Intelligence, Minneapolis, MN, 1988. American Association for Artificial Intelligence, pp. 83-88.
[2] R.A. Brooks, “Symbolic error analysis and robot planning,” International Journal of Robotics Research, Vol. 1, No. 4, Winter 1982, pp. 29-68.
[3] C.A. Broverman, “Constructive Interpretation of Human-Generated Exceptions During Plan Execution,” Ph.D. thesis, COINS Dept., University of Massachusetts, Amherst, MA, February 1991.
[4] P.R. Cohen, M. Greenberg, D.M. Hart, and A.E. Howe, “Trial by fire: understanding the design requirements for agents in complex environments,” AI Magazine, Vol. 10, No. 3, Fall 1989.
[5] F.J. Corbato, “On building systems that will fail,” Communications of the ACM, Vol. 34, No. 9, September 1991, pp. 72-81.
[6] M. Gini, “Automatic error detection and recovery,” Technical Report 88-48, Computer Science Dept., University of Minnesota, Minneapolis, MN, June 1988.
[7] K.J. Hammond, “Case-Based Planning: An Integrated Theory of Planning, Learning and Memory,” Ph.D. thesis, Dept. of Computer Science, Yale University, New Haven, CT, October 1986.
[8] S. Hanks and R.J. Firby, “Issues and architectures for planning and execution,” in K.P. Sycara, editor, Proceedings of the Workshop on Innovative Approaches to Planning, Scheduling and Control, Palo Alto, CA, November 1990. Morgan Kaufmann Publishers, Inc., pp. 71-76.
[9] A.E. Howe, “Analyzing failure recovery to improve planner design,” Proceedings of the Tenth National Conference on Artificial Intelligence, July 1992, pp. 387-393.
[10] A.E. Howe, “Accepting the Inevitable: The Role of Failure Recovery in the Design of Planners,” Ph.D. thesis, Dept. of Computer Science, University of Massachusetts, Amherst, MA, February 1993.
[11] A.E. Howe and P.R. Cohen, “Failure recovery: A model and experiments,” Proceedings of the Ninth National Conference on Artificial Intelligence, Anaheim, CA, July 1991, pp. 801-808.
[12] S. Kambhampati and J.A. Hendler, “A validation structure based theory of plan modification and reuse,” Artificial Intelligence Journal, Vol. 55, 1992, pp. 193-258.
[13] V. Lesser, “A retrospective view of FA/C distributed problem solving,” IEEE Transactions on Systems, Man, and Cybernetics, Vol. 21, No. 6, 1991, pp. 1346-1363.
[14] N.G. Leveson, “Software safety: Why, what, and how,” Computing Surveys, Vol. 18, No. 2, June 1986, pp. 125-163.
[15] D.M. Lyons, R. Vijaykumar, and S.T. Venkataraman, “A representation for error detection and recovery in robot plans,” Proceedings of the SPIE Symposium on Intelligent Control and Adaptive Systems, Philadelphia, PA, November 1989, pp. 14-25.
[16] D.P. Miller, “Execution monitoring for a mobile robot system,” Proceedings of the SPIE Symposium on Intelligent Control and Adaptive Systems, Philadelphia, PA, November 1989, pp. 36-43.
[17] S. Minton, M.D. Johnston, A.B. Philips, and P. Laird, “Solving large-scale constraint satisfaction and scheduling problems using a heuristic repair method,” Proceedings of the Ninth National Conference on Artificial Intelligence, Anaheim, CA, 1991. American Association for Artificial Intelligence, pp. 17-24.
[18] L. Morgenstern, “Replanning,” Proceedings of the DARPA Knowledge-Based Planning Workshop, Austin, TX, December 1987, pp. 5-1 - 5-10.
[19] N.H. Narayanan and N. Viswanadham, “A methodology for knowledge acquisition and reasoning in failure analysis of systems,” IEEE Transactions on Systems, Man and Cybernetics, Vol. SMC-17, No. 2, March/April 1987, pp. 274-288.
[20] S.Y. Nof, O.Z. Maimon, and R.G. Wilhelm, “Experiments for planning error-recovery program in robotic work,” Proceedings of the 1987 ASME International Computers in Engineering Conference, New York, NY, August 1987, pp. 253-262.
[21] C. Owens, “Representing abstract plan failures,” Proceedings of the Twelfth Cognitive Science Conference, Boston, MA, 1990. Cognitive Science Society, pp. 277-284.
[22] H.J. Porta, “Dynamic replanning,” Proceedings of ROBEXS '86: Second Annual Workshop on Robotics and Expert Systems, June 1986, pp. 109-115.
[23] R. Simmons, “Monitoring and error recovery for autonomous walking,” Proceedings of the IEEE International Workshop on Intelligent Robots and Systems, July 1992, pp. 1407-1412.
[24] R.G. Simmons, “A theory of debugging plans and interpretations,” Proceedings of the Seventh National Conference on Artificial Intelligence, Minneapolis, MN, 1988. American Association for Artificial Intelligence, pp. 94-99.
[25] H.A. Simon and J.B. Kadane, “Optimal problem-solving search: all-or-none solutions,” Artificial Intelligence Journal, Vol. 6, 1975, pp. 235-247.
[26] S.F. Smith, P.S. Ow, N. Muscettola, J.-Y. Potvin, and D.C. Matthys, “OPIS: an integrated framework for generating and revising factory schedules,” in K.P. Sycara, editor, Proceedings of the Workshop on Innovative Approaches to Planning, Scheduling and Control, November 1990. Morgan Kaufmann Publishers, Inc., pp. 497-507.
[27] S. Srinivas, “Error Recovery in Robot Systems,” Ph.D. thesis, California Institute of Technology, Pasadena, CA, 1977.
[28] G.J. Sussman, “A computational model of skill acquisition,” Technical Report AI-TR-297, MIT AI Lab, 1973.
[29] K. Sycara, “Using case-based reasoning for plan adaptation and repair,” Proceedings of a Workshop on Case-Based Reasoning, 1988. Morgan Kaufmann Publishers, Inc., pp. 425-434.
[30] D.E. Wilkins, “Recovering from execution errors in SIPE,” Technical Report 346, Artificial Intelligence Center, Computer Science and Technology Center, SRI International, 1985.
[31] R. Zito-Wolf and R. Alterman, “Ad-hoc, fail-safe plan learning,” Proceedings of the Twelfth Cognitive Science Conference, Boston, MA, July 1990, pp. 908-913.

Index Terms:
artificial intelligence, planning, failure recovery, reliability, debugging.
Citation:
Adele E. Howe, "Improving the Reliability of Artificial Intelligence Planning Systems by Analyzing their Failure Recovery," IEEE Transactions on Knowledge and Data Engineering, vol. 7, no. 1, pp. 14-25, Feb. 1995, doi:10.1109/69.368521