Issue No. 12 - December 1991 (vol. 40)
pp. 1380-1389
ABSTRACT
The effects of silicon implementation on the backpropagation learning rule in artificial neural systems are examined. The effects on learning performance of limited weight resolution, range limitations, and the steepness of the activation function are considered. A minimum resolution of about 20/22 bits is generally required, but this figure can be reduced to about 14/15 bits by properly choosing the learning parameter eta, which attains good performance in the presence of limited resolution. Performance can be further improved by using a modified batch backpropagation rule. The theoretical analysis is compared with ad hoc simulations, and the results are discussed in detail.
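The interplay the abstract describes between weight resolution and the learning rate eta can be illustrated with a small simulation. The Python sketch below is a hypothetical example, not the authors' simulator or their experimental settings: it trains a tiny sigmoid network on XOR with batch backpropagation while rounding every weight onto a fixed-point grid after each update. The bit counts, the assumed weight range of plus or minus 4, and eta = 0.5 are illustrative choices only.

import numpy as np

def quantize(w, bits, w_range=4.0):
    # Map weights onto a uniform grid of 2**bits levels spanning [-w_range, w_range]
    step = 2.0 * w_range / (2 ** bits)
    return np.clip(np.round(w / step) * step, -w_range, w_range)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(bits, eta, epochs=20000, seed=0):
    rng = np.random.default_rng(seed)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    T = np.array([[0.], [1.], [1.], [0.]])
    Xb = np.hstack([X, np.ones((4, 1))])              # inputs with a constant bias column
    W1 = quantize(rng.uniform(-1, 1, (3, 2)), bits)   # input -> hidden weights (incl. bias row)
    W2 = quantize(rng.uniform(-1, 1, (3, 1)), bits)   # hidden -> output weights (incl. bias row)
    for _ in range(epochs):
        H = sigmoid(Xb @ W1)                          # hidden activations
        Hb = np.hstack([H, np.ones((4, 1))])
        Y = sigmoid(Hb @ W2)                          # network output
        dY = (Y - T) * Y * (1 - Y)                    # output-layer deltas
        dH = (dY @ W2[:2].T) * H * (1 - H)            # hidden-layer deltas (bias row excluded)
        # Weights are re-quantized after every batch step: when eta times the gradient
        # falls below the grid step, the update is lost entirely.
        W2 = quantize(W2 - eta * Hb.T @ dY, bits)
        W1 = quantize(W1 - eta * Xb.T @ dH, bits)
    H = sigmoid(Xb @ W1)
    Y = sigmoid(np.hstack([H, np.ones((4, 1))]) @ W2)
    return float(np.mean((Y - T) ** 2))

for bits in (8, 12, 16, 24):
    print(f"{bits:2d}-bit weights, eta=0.5 -> final XOR MSE {train_xor(bits, eta=0.5):.4f}")

With coarse grids many updates round to zero, which is why the abstract's point about choosing eta carefully, or accumulating updates in a batch rule, matters for low-resolution implementations; exact outcomes of this toy run depend on the random seed.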
INDEX TERMS
VLSI; silicon implementations; backpropagation algorithms; artificial neural networks; learning rule; performance; limited weight resolution; range limitations; steepness; activation function; simulations; Si; artificial intelligence; learning systems; neural nets.
CITATION
L.M. Reyneri and E. Filippi, "An Analysis on the Performance of Silicon Implementations of Backpropagation Algorithms for Artificial Neural Networks," IEEE Transactions on Computers, vol. 40, no. 12, pp. 1380-1389, Dec. 1991, doi: 10.1109/12.106223.
