Issue No. 12 - December (1991 vol. 40)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/12.106229
<p>The authors present an analytic evaluation of saturation and noise performance for a large class of associative memories based on matrix operations, emphasizing the value of standard linear algebra techniques for evaluating the noise performance of associative memories. They give a detailed comparative analysis of the correlation matrix memory and the generalized inverse memory construction rules for auto-associative memories and neural classifiers, along with analytic results for the noise performance of neural classifiers that can store several prototypes in one class. The analysis indicates that, for neural classifiers, the simple correlation matrix memory provides better noise performance than the more complex generalized inverse memory.</p>
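<p>For readers unfamiliar with the two construction rules compared in the paper, the following is a minimal NumPy sketch (not the authors' code) of how such weight matrices are commonly built: the correlation matrix memory as a sum of outer products of bipolar prototype vectors, and the generalized inverse memory via the Moore-Penrose pseudoinverse. Dimensions, the random prototypes, and the one-step sign-nonlinearity recall are illustrative assumptions.</p>

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 64, 5  # assumed pattern dimension and number of prototypes

# Bipolar (+1/-1) prototype vectors stored as columns of X.
X = rng.choice([-1.0, 1.0], size=(n, m))

# Correlation matrix memory: sum of outer products of the prototypes,
# i.e. W = X X^T (a symmetric matrix).
W_corr = X @ X.T

# Generalized inverse memory: W = X X^+, where X^+ is the Moore-Penrose
# pseudoinverse; this W projects onto the subspace spanned by the
# prototypes, so W X = X exactly.
W_gi = X @ np.linalg.pinv(X)

def recall(W, probe):
    """One-step auto-associative recall through a hard-limiter (sign)."""
    return np.sign(W @ probe)

# Probe with a noisy copy of the first prototype (a few flipped components).
probe = X[:, 0].copy()
flip = rng.choice(n, size=5, replace=False)
probe[flip] *= -1.0

recalled_corr = recall(W_corr, probe)
recalled_gi = recall(W_gi, probe)
```

<p>The generalized inverse memory stores the prototypes exactly (W X = X), while the correlation matrix memory introduces crosstalk between prototypes; the paper's contribution is an analytic comparison of how the two behave under noise, not captured by this sketch.</p>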
linear algebra approach; neural associative memories; noise performance; neural classifiers; analytic evaluation; saturation; comparative analysis; correlation matrix memory; generalized inverse memory construction rules; content-addressable storage; linear algebra; neural nets; performance evaluation.
K. Fassett, N. Vassilas and V. Cherkassky, "Linear Algebra Approach to Neural Associative Memories and Noise Performance of Neural Classifiers," in IEEE Transactions on Computers, vol. 40, no. 12, pp. 1429-1435, December 1991.