Comparative Analysis of Backpropagation and the Extended Kalman Filter for Training Multilayer Perceptrons
June 1992 (vol. 14, no. 6)
pp. 686-691

The relationship between backpropagation and extended Kalman filtering for training multilayer perceptrons is examined. Backpropagation is a technique from neural networks for assigning weights in a multilayer perceptron; an extended Kalman filter can be used for the same purpose. The two techniques are compared theoretically and empirically using sensor imagery. A brief review of the multilayer perceptron and the two training methods is provided, and it is then shown that backpropagation is a degenerate form of the extended Kalman filter. The training rules are compared on two examples: an image classification problem using laser radar Doppler imagery and a target detection problem using absolute range images. In both examples, the backpropagation training algorithm is shown to be three orders of magnitude less costly than the extended Kalman filter algorithm in terms of the number of floating-point operations.
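The central theoretical point, that backpropagation is a degenerate form of the extended Kalman filter, can be made concrete with a small sketch. The code below is illustrative only and is not the authors' experimental implementation: it applies one backpropagation step and one global extended-Kalman-filter step, in which the network weights form the state vector and each training pattern supplies a scalar measurement, to a toy single-hidden-layer perceptron with a sigmoid output. The network sizes, learning rate eta, initial covariance P0, and measurement noise R are all assumed values chosen for illustration.

# Illustrative sketch only (not the paper's experimental code): one weight
# update by backpropagation and one by a global extended Kalman filter for a
# toy single-hidden-layer perceptron with a scalar sigmoid output. The sizes
# n_in and n_hid, the learning rate eta, the initial covariance P0 = 100*I,
# and the measurement noise R are assumed values.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 3, 4                       # toy network dimensions
n_w = n_hid * n_in + n_hid               # total number of weights

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def unpack(w):
    # Split the flat weight vector into hidden-layer and output-layer weights.
    W1 = w[:n_hid * n_in].reshape(n_hid, n_in)
    w2 = w[n_hid * n_in:]
    return W1, w2

def forward_and_jacobian(w, x):
    # Network output y and the row Jacobian dy/dw of the output with respect
    # to every weight; this Jacobian is what both training rules share.
    W1, w2 = unpack(w)
    h = sigmoid(W1 @ x)                  # hidden activations
    y = sigmoid(w2 @ h)                  # scalar output
    dy_dnet = y * (1.0 - y)              # sigmoid derivative at the output node
    dy_dw2 = dy_dnet * h
    dy_dW1 = np.outer(dy_dnet * w2 * h * (1.0 - h), x)
    return y, np.concatenate([dy_dW1.ravel(), dy_dw2])

def bp_step(w, x, d, eta=0.5):
    # Backpropagation: per-pattern gradient descent on the squared error,
    # costing O(n_w) floating-point operations per pattern.
    y, H = forward_and_jacobian(w, x)
    return w + eta * (d - y) * H

def ekf_step(w, P, x, d, R=0.1):
    # Global EKF: the weights are the state, the desired output d is a scalar
    # measurement, and H is the measurement Jacobian. The covariance update
    # costs O(n_w**2) operations per pattern, the dominant extra expense.
    y, H = forward_and_jacobian(w, x)
    S = H @ P @ H + R                    # scalar innovation covariance
    K = (P @ H) / S                      # Kalman gain
    w_new = w + K * (d - y)
    P_new = P - np.outer(K, H @ P)
    return w_new, P_new

w = rng.normal(scale=0.5, size=n_w)
P = 100.0 * np.eye(n_w)                  # assumed initial weight covariance
x, d = rng.normal(size=n_in), 1.0        # one toy training pattern
w_bp = bp_step(w.copy(), x, d)
w_ekf, P = ekf_step(w.copy(), P, x, d)

In this sketch, holding P fixed at eta times the identity and treating the innovation covariance S as a constant collapses ekf_step into bp_step, mirroring the degenerate-form relationship stated in the abstract; the covariance recursion is also where the EKF's additional per-pattern cost arises, consistent with the floating-point-operation comparison reported above.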


Index Terms:
backpropagation; extended Kalman filter; training; multilayer perceptrons; sensor imagery; neural networks; image classification; laser radar Doppler imagery; target detection; absolute range images; Kalman filters; neural nets; pattern recognition
Citation:
D.W. Ruck, S.K. Rogers, M. Kabrisky, P.S. Maybeck, M.E. Oxley, "Comparative Analysis of Backpropagation and the Extended Kalman Filter for Training Multilayer Perceptrons," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 6, pp. 686-691, June 1992, doi:10.1109/34.141559