IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN'00)-Volume 2
Training Feedforward Neural Networks Using Orthogonal Iteration of the Hessian Eigenvectors
Como, Italy
July 24-July 27
ISBN: 0-7695-0619-4
Andrew Hunter, University of Sunderland
The paper describes a new training algorithm with scalable memory requirements, which may range from O(W) to O(W²), although in practice the useful range is limited to lower complexity levels. The algorithm is based upon a novel iterative estimation of the principal eigen-subspace of the Hessian, together with a quadratic step estimation procedure. It is shown that the new algorithm has convergence time comparable to conjugate gradient descent, and may be preferable if early stopping is used, as it converges more quickly during the initial phases. Section 2 overviews the principles of second order training algorithms. Section 3 introduces the new algorithm. Section 4 discusses some experiments to confirm the algorithm's performance; section 5 concludes the paper.
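As background, the subspace estimate at the heart of the method can be pictured as classical orthogonal iteration: repeatedly multiply an orthonormal basis by the Hessian and re-orthonormalize, using Hessian-vector products so the full W×W matrix is never formed. The sketch below is a minimal illustration of that textbook procedure under those assumptions, not the paper's exact algorithm; the name hess_vec and all parameters are hypothetical, and the paper's quadratic step estimation is omitted.

import numpy as np

def orthogonal_iteration(hess_vec, n_weights, k, n_iters=50, seed=0):
    """Estimate the k principal eigenvectors of a Hessian.

    hess_vec: assumed routine mapping an (n_weights, k) matrix V to H @ V,
              e.g. via k Hessian-vector products, so H is never stored.
    Returns (V, eigvals): an orthonormal basis of the estimated principal
    eigen-subspace and Rayleigh-quotient eigenvalue estimates.
    """
    rng = np.random.default_rng(seed)
    # Random orthonormal starting basis.
    V = np.linalg.qr(rng.standard_normal((n_weights, k)))[0]
    for _ in range(n_iters):
        W = hess_vec(V)            # multiply the basis by the Hessian
        V, _ = np.linalg.qr(W)     # re-orthonormalize (QR factorization)
    # Rayleigh quotients diag(V^T H V); converges toward the eigenvalues
    # of largest magnitude for a symmetric H.
    eigvals = np.sum(V * hess_vec(V), axis=0)
    return V, eigvals

# Toy usage with an explicit symmetric matrix standing in for the Hessian.
if __name__ == "__main__":
    A = np.random.default_rng(1).standard_normal((100, 100))
    H = (A + A.T) / 2
    V, lams = orthogonal_iteration(lambda V: H @ V, n_weights=100, k=5)
    print(lams)  # approximates the 5 largest-magnitude eigenvalues

Storing only the k basis vectors gives the O(kW) memory footprint the abstract alludes to, in contrast to the O(W²) cost of forming the Hessian explicitly.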
Citation:
Andrew Hunter, "Training Feedforward Neural Networks Using Orthogonal Iteration of the Hessian Eigenvectors," Proc. IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN'00), vol. 2, p. 2173, 2000.