Como, Italy
July 24, 2000 to July 27, 2000
ISBN: 0-7695-0619-4
pp: 2173
Andrew Hunter, University of Sunderland
ABSTRACT
The paper describes a new training algorithm with scalable memory requirements, which may range from O(W) to O(W²), although in practice the useful range is limited to lower complexity levels. The algorithm is based upon a novel iterative estimation of the principal eigen-subspace of the Hessian, together with a quadratic step estimation procedure. It is shown that the new algorithm has convergence time comparable to conjugate gradient descent, and may be preferable if early stopping is used, as it converges more quickly during the initial phases. Section 2 overviews the principles of second-order training algorithms. Section 3 introduces the new algorithm. Section 4 discusses some experiments to confirm the algorithm's performance; Section 5 concludes the paper.
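For illustration, the core idea named in the title, orthogonal iteration of the Hessian's leading eigenvectors, can be sketched using only Hessian-vector products, so that storage for k eigenvectors is O(Wk): k = 1 gives the O(W) end of the abstract's range and k = W the O(W²) end. This is a minimal sketch of generic orthogonal (subspace) iteration, not the paper's implementation; the names `orthogonal_iteration` and `hvp`, and the toy quadratic Hessian in the usage example, are illustrative assumptions.

```python
import numpy as np

def orthogonal_iteration(hvp, n_params, k, n_iters=50, seed=0):
    """Estimate the top-k Hessian eigenpairs by orthogonal (subspace)
    iteration, using only Hessian-vector products.

    hvp: callable mapping an (n_params, k) matrix V to H @ V.
    Returns (eigvals, V), where V has orthonormal columns spanning an
    estimate of the principal eigen-subspace. Storage is O(n_params * k).
    """
    rng = np.random.default_rng(seed)
    # Random orthonormal starting basis for the subspace.
    V, _ = np.linalg.qr(rng.standard_normal((n_params, k)))
    for _ in range(n_iters):
        W = hvp(V)                # apply the Hessian to the current basis
        V, _ = np.linalg.qr(W)    # re-orthonormalize via QR decomposition
    # Rayleigh-quotient estimates of the leading eigenvalues: diag(V^T H V).
    eigvals = np.sum(V * hvp(V), axis=0)
    return eigvals, V

if __name__ == "__main__":
    # Toy check against a dense symmetric PSD "Hessian" where hvp is a matmul.
    A = np.random.default_rng(1).standard_normal((100, 100))
    H = A @ A.T
    vals, V = orthogonal_iteration(lambda X: H @ X, n_params=100, k=5)
    print(np.sort(vals)[::-1])                       # estimated top-5
    print(np.sort(np.linalg.eigvalsh(H))[-5:][::-1]) # exact top-5
```

In a network-training setting the explicit Hessian would never be formed; `hvp` would instead be supplied by a Hessian-vector product routine (e.g., the Pearlmutter R-operator or finite differences of gradients), keeping the O(Wk) memory bound.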
CITATION
Andrew Hunter, "Training Feedforward Neural Networks Using Orthogonal Iteration of the Hessian Eigenvectors", IJCNN, 2000, Neural Networks, IEEE - INNS - ENNS International Joint Conference on, Neural Networks, IEEE - INNS - ENNS International Joint Conference on 2000, pp. 2173, doi:10.1109/IJCNN.2000.857893