Comparative Fault Tolerance of Parallel Distributed Processing Networks
November 1994 (vol. 43 no. 11)
pp. 1323-1329

We propose a method for evaluating and comparing the fault tolerance of a wide variety of parallel distributed processing networks (more commonly referred to as artificial neural networks). Although these computing networks are biologically inspired and share many features of biological neural networks, they are not inherently tolerant of the loss of processing elements. We examine two classes of networks, multilayer perceptrons and Gaussian radial basis function networks, and show that there is a marked difference in their operational fault tolerance. Furthermore, we show that fault tolerance is influenced by the training algorithm used and even by the initial state of the network. Using an idea due to Séquin and Clay (1990), we show that training with intermittent, randomly selected faults can dramatically enhance the fault tolerance of radial basis function networks, while it yields only marginal improvement when used with multilayer perceptrons.
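The fault-injection training scheme described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the toy sine-approximation task, the fixed RBF centers and width, the 10% per-epoch fault rate, and plain gradient descent on the output weights are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D function-approximation task (assumed for illustration).
X = np.linspace(-1.0, 1.0, 50)[:, None]
y = np.sin(np.pi * X[:, 0])

# Gaussian RBF layer with fixed centers and width; only the linear
# output weights are trained.
centers = np.linspace(-1.0, 1.0, 20)[:, None]
width = 0.3

def rbf_features(X):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def train(fault_rate=0.0, epochs=200, lr=0.05):
    """Gradient descent on the output weights. With fault_rate > 0,
    a randomly selected subset of hidden units is intermittently
    'faulted' (output stuck at zero) on each training epoch."""
    w = np.zeros(len(centers))
    for _ in range(epochs):
        mask = rng.random(len(centers)) >= fault_rate
        Phi = rbf_features(X) * mask      # injected faults this epoch
        err = Phi @ w - y
        w -= lr * (Phi.T @ err) / len(X)
    return w

def mse_with_dead_unit(w, i):
    """Operational error after permanently losing hidden unit i."""
    Phi = rbf_features(X)
    Phi[:, i] = 0.0
    return float(np.mean((Phi @ w - y) ** 2))

w_plain = train(fault_rate=0.0)
w_ft = train(fault_rate=0.1)              # ~10% of units faulted per epoch

# Average degradation over all possible single-unit failures.
deg_plain = np.mean([mse_with_dead_unit(w_plain, i) for i in range(len(centers))])
deg_ft = np.mean([mse_with_dead_unit(w_ft, i) for i in range(len(centers))])
```

Because each Gaussian unit responds only locally, fault injection forces overlapping units to share responsibility for each input region, which is the intuition behind the dramatic improvement the paper reports for radial basis function networks.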

[1] G. Bolt, J. Austin, and G. Morgan, "Fault tolerance in neural networks," Tech. Rep., Dep. of Comput. Sci., Univ. of York, Heslington, York, YO1 5DD, U.K., 1990.
[2] D. A. Carrara, "Three techniques for improving operational fault tolerance in multi-layer perceptrons," M.S. thesis, ECE Dep., Univ. of New Hampshire, Durham, NH, 1993.
[3] M. J. Carter, "The illusion of fault tolerance in neural networks for pattern recognition and signal processing," in Proc. Tech. Session on Fault-Tolerant Integrated Systems, Univ. of New Hampshire, Durham, NH, 1988.
[4] L. C. Chu and B. W. Wah, "Fault tolerant neural networks with hybrid redundancy," in Proc. IJCNN, San Diego, CA, June 1990, pp. II-639-649.
[5] M. J. Dzwonczyk, "Quantitative failure models of feed-forward neural networks," M.S. thesis, Dep. of Aeronautics and Astronautics, MIT, Cambridge, MA, Feb. 1991.
[6] S. E. Fahlman, "Faster-learning variations on back-propagation: An empirical study," in Proceedings of the 1988 Connectionist Models Summer School. San Mateo, CA: Morgan Kaufmann, 1988.
[7] Y. Izui and A. Pentland, "Analysis of neural networks with redundancy," Neural Computation, vol. 2, pp. 226-238, 1990.
[8] C. Neti, M. Schneider, and E. Young, "Maximally fault tolerant neural networks and nonlinear programming," in Proc. Int. Joint Conf. on Neural Netw., San Diego, CA, June 1990, pp. II-483-496.
[9] J. Nijhuis, B. Hofflinger, A. van Schaik, and L. Spaanenburg, "Limits to the fault-tolerance of a feedforward neural network with learning," in Proc. 20th IEEE Fault Tolerant Computing Symp., 1990.
[10] D. B. Parker, "Learning logic," Tech. Rep. TR-47, Center for Computational Res. in Economics and Management Sci., MIT, Cambridge, MA, 1985.
[11] D. S. Phatak and I. Koren, "A study of fault tolerance properties of artificial neural nets," Tech. Rep., Elec. and Comput. Eng. Dep., Univ. of Massachusetts, Amherst, MA, 1991.
[12] T. Poggio and F. Girosi, "Networks for approximation and learning," Proc. IEEE, vol. 78, pp. 1481-1497, Sept. 1990.
[13] D. E. Rumelhart and J. L. McClelland, Eds., Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vols. 1-2, Cambridge, MA: MIT Press, 1986.
[14] D. E. Rumelhart, "Learning and generalization," transcript of plenary address, in Proc. IEEE ICNN, San Diego, CA, 1988.
[15] B. E. Segee and M. J. Carter, "Fault tolerance of pruned multilayer networks," in Proc. IJCNN, Seattle, WA, 1991, pp. II-447-452.
[16] B. E. Segee and M. J. Carter, "Fault sensitivity and nodal relevance relationships in multi-layer perceptrons," Tech. Rep. ECE.IS.90.02, Dep. of Elec. and Comput. Eng., Univ. of New Hampshire, Durham, NH, Mar. 1990.
[17] C. H. Séquin and R. D. Clay, "Fault tolerance in artificial neural networks," in Proc. IJCNN, San Diego, CA, June 1990, pp. I-703-708.
[18] R. L. Watrous, "Learning algorithms for connectionist networks: Applied gradient methods of nonlinear optimization," in Proc. IEEE Int. Conf. Neural Netw., San Diego, CA, 1987, vol. II, pp. 619-627.
[19] P. J. Werbos, "Beyond regression: New tools for prediction and analysis in the behavioral sciences," Ph.D. dissertation, Harvard Univ., Cambridge, MA, 1974.

Index Terms:
feedforward neural nets; fault tolerant computing; backpropagation; fault tolerance; parallel distributed processing networks; artificial neural networks; computing networks; biological neural networks; multilayer perceptrons; Gaussian radial basis function networks; training algorithm; function approximation; neural networks; robustness.
Citation:
B.E. Segee, M.J. Carter, "Comparative Fault Tolerance of Parallel Distributed Processing Networks," IEEE Transactions on Computers, vol. 43, no. 11, pp. 1323-1329, Nov. 1994, doi:10.1109/12.324565