Comments on "Comparative Analysis of Backpropagation and the Extended Kalman Filter for Training Multilayer Perceptrons"
August 1994 (vol. 16, no. 8)
pp. 862-863

In this note, the connection between the backpropagation algorithm and the extended Kalman filter is analyzed using an alternate representation of the Kalman gain term, and the resulting relationship is shown to be much simpler than that reported by Ruck et al. [1].
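For context, the two updates being compared can be sketched in their standard forms (a minimal sketch following the formulations in [2] and [3]; the symbols w_k, H_k, P_k, R_k, K_k, d_k, y_k, and \eta are generic placeholders and do not reproduce the note's alternate Kalman gain representation). Treating the weight vector w_k as the state and the network output y_k = h(w_k, x_k) as the measurement, the extended Kalman filter update is

\[
K_k = P_k H_k^\top \left( R_k + H_k P_k H_k^\top \right)^{-1}, \qquad
w_{k+1} = w_k + K_k \left( d_k - y_k \right), \qquad
P_{k+1} = P_k - K_k H_k P_k ,
\]

where H_k = \partial h / \partial w evaluated at w_k (obtainable by backpropagating the output with respect to the weights), while the backpropagation rule of [2] is the gradient-descent step

\[
w_{k+1} = w_k + \eta \, H_k^\top \left( d_k - y_k \right).
\]

In this generic form, the backpropagation step can be viewed as replacing the Kalman gain K_k with the scaled Jacobian transpose \eta H_k^\top.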

[1] D. W. Ruck, S. K. Rogers, M. Kabrisky, P. S. Maybeck, and M. E. Oxley, "Comparative analysis of backpropagation and extended Kalman filter for training multilayer perceptrons," IEEE Trans. Pattern Anal. Machine Intell., vol. 14, no. 6, pp. 686-691, 1992.
[2] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning representations by back-propagating errors," Nature, vol. 323, pp. 533-536, 1986.
[3] S. Singhal and L. Wu, "Training multilayer perceptrons with the extended Kalman algorithm," in Advances in Neural Information Processing Systems 1 (D. S. Touretzky, Ed.). San Mateo, CA: Morgan Kaufmann, 1989, pp. 133-140.
[4] R. F. Stengel, Stochastic Optimal Control. New York: John Wiley, 1986.
[5] O. L. R. Jacobs, Introduction to Control Theory. Oxford: Oxford Univ. Press, 1974.

Index Terms:
backpropagation; extended Kalman filter; Kalman filters; Kalman gain; feedforward neural nets; multilayer perceptrons; training; comparative analysis
Citation:
P. Sarat Chandran, "Comments on "Comparative Analysis of Backpropagation and the Extended Kalman Filter for Training Multilayer Perceptrons"," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 16, no. 8, pp. 862-863, Aug. 1994, doi:10.1109/34.308485