Issue No. 01 - January 1992 (vol. 14)
pp. 76-86
ABSTRACT
The authors propose a theoretical framework for backpropagation (BP) in order to identify some of its limitations as a general learning procedure and the reasons for its success in several pattern recognition experiments. The first important conclusion is that examples can be found in which BP gets stuck in local minima. A simple example is presented in which BP can get stuck during gradient descent without having learned the entire training set, even though the example guarantees the existence of a solution with null cost. Some conditions on the network architecture and the learning environment that ensure the convergence of the BP algorithm are proposed. In particular, it is proven that convergence holds if the classes are linearly separable. In this case, the experience gained in several experiments shows that multilayered neural networks (MLNs) exceed perceptrons in generalization to new examples.
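As a rough illustration of the phenomenon described above (not the construction analyzed in the paper), the following sketch trains a 2-2-1 sigmoid network on the XOR problem with plain batch gradient descent on a squared-error cost. A null-cost solution exists for this task, yet, depending on the random initialization, the cost may stall at a non-zero value instead of decreasing to zero. The architecture, learning rate, and step count are arbitrary choices made for the demonstration.

    # Minimal sketch: batch gradient descent (BP) on a 2-2-1 sigmoid network for XOR.
    # A zero-cost solution exists; some initializations may still stall above zero.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train_xor(seed, steps=20000, lr=0.5):
        rng = np.random.default_rng(seed)
        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
        y = np.array([[0.0], [1.0], [1.0], [0.0]])

        # Weights and biases of a 2-2-1 network, small random initialization.
        W1 = rng.normal(scale=1.0, size=(2, 2)); b1 = np.zeros((1, 2))
        W2 = rng.normal(scale=1.0, size=(2, 1)); b2 = np.zeros((1, 1))

        for _ in range(steps):
            # Forward pass.
            h = sigmoid(X @ W1 + b1)        # hidden activations
            out = sigmoid(h @ W2 + b2)      # network output
            err = out - y                   # residuals

            # Backward pass: gradients of the squared-error cost.
            d_out = err * out * (1.0 - out)
            d_h = (d_out @ W2.T) * h * (1.0 - h)
            W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0, keepdims=True)
            W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0, keepdims=True)

        # Final cost: zero means the whole training set was learned.
        out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
        return 0.5 * np.mean((out - y) ** 2)

    for seed in range(10):
        print(f"seed {seed}: final cost = {train_xor(seed):.4f}")

Running the loop over several seeds makes the point of the abstract concrete: the same learning problem and the same network can reach null cost from one initialization and stall from another.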
INDEX TERMS
learning systems; local minima; backpropagation; pattern recognition; network architecture; convergence; multilayered neural networks; perceptrons; neural nets
CITATION
M. Gori and A. Tesi, "On the Problem of Local Minima in Backpropagation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 1, pp. 76-86, January 1992, doi: 10.1109/34.107014.