
V. Cherkassky, K. Fassett, and N. Vassilas, "Linear Algebra Approach to Neural Associative Memories and Noise Performance of Neural Classifiers," IEEE Transactions on Computers, vol. 40, no. 12, pp. 1429-1435, December 1991.
The authors present an analytic evaluation of saturation and noise performance for a large class of associative memories based on matrix operations, emphasizing the value of standard linear algebra techniques for evaluating the noise performance of such memories. A detailed comparative analysis of the correlation matrix memory and the generalized inverse memory construction rules is given for autoassociative memories and neural classifiers, including analytic results for the noise performance of classifiers that store several prototypes per class. The analysis indicates that, for neural classifiers, the simple correlation matrix memory provides better noise performance than the more complex generalized inverse memory.
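The two construction rules compared in the paper can be sketched with NumPy. In this minimal illustration (the pattern dimension, number of prototypes, bipolar encoding, and noise level are assumptions for demonstration, not the paper's exact experimental setup), the correlation matrix memory is built by the outer-product rule and the generalized inverse memory by projecting onto the span of the stored prototypes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: m bipolar prototype patterns of dimension n,
# stored as the columns of X.
n, m = 64, 5
X = rng.choice([-1.0, 1.0], size=(n, m))

# Correlation matrix memory (outer-product rule): M = (1/n) * X X^T
M_cmm = X @ X.T / n

# Generalized inverse memory: M = X X^+, the orthogonal projector
# onto the subspace spanned by the prototypes.
M_gi = X @ np.linalg.pinv(X)

# Recall from a noisy key: flip a few components of prototype 0.
x = X[:, 0].copy()
flip = rng.choice(n, size=8, replace=False)
x[flip] *= -1

# One-step retrieval followed by hard thresholding.
recalled_cmm = np.sign(M_cmm @ x)
recalled_gi = np.sign(M_gi @ x)

print("CMM recall errors:", int(np.sum(recalled_cmm != X[:, 0])))
print("GI  recall errors:", int(np.sum(recalled_gi != X[:, 0])))
```

Counting the residual bit errors of each memory over many noisy keys gives an empirical counterpart to the paper's analytic noise-performance comparison.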