Dimensionality-Reduction Using Connectionist Networks
March 1989 (vol. 11 no. 3)
pp. 304-314

A method is presented for using connectionist networks of simple computing elements to discover a particular type of constraint in multidimensional data. Suppose that some data source provides samples consisting of n-dimensional feature-vectors, but that these data all happen to lie on an m-dimensional surface embedded in the n-dimensional feature space. Then occurrences of data can be described more concisely by specifying an m-dimensional location on the embedded surface than by reciting all n components of the feature vector. The recoding of data in this way is known as dimensionality-reduction. The method presented performs dimensionality-reduction in a wide class of situations in which no assumption of linearity need be made about the underlying constraint surface, and it takes advantage of the self-organizing properties of connectionist networks of simple computing elements. The author also presents a scheme for representing the values of continuous (scalar) variables in subsets of units.
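In modern terms, the core idea corresponds to training a bottleneck ("autoencoder") network by backpropagation: data is forced through a hidden layer of only m units, and if the network can still reconstruct its n-dimensional input, those m hidden activations serve as the concise description of a point on the constraint surface. The sketch below is a minimal illustration of that idea only, not a reproduction of the paper's exact architecture or of its continuous-variable representation scheme; the network sizes, learning rate, and synthetic arc-shaped data set are all illustrative assumptions.

    # A minimal sketch (assumed setup, not the paper's exact method):
    # a bottleneck network trained by backpropagation to reconstruct
    # its input. Data sampled from a 1-D curve (m = 1) embedded in
    # 2-D space (n = 2) is squeezed through a single bottleneck unit,
    # whose activation acts as the recovered 1-dimensional coordinate.
    import numpy as np

    rng = np.random.default_rng(0)

    # n = 2 feature dimensions; data lies on an m = 1 curve (an arc).
    t = rng.uniform(0.0, np.pi, size=(200, 1))        # latent coordinate
    X = np.hstack([np.cos(t), np.sin(t)])             # points on the arc

    n, m, h = 2, 1, 16                                # h = hidden-layer width
    W1 = rng.normal(0, 0.5, (n, h)); b1 = np.zeros(h) # input  -> hidden
    W2 = rng.normal(0, 0.5, (h, m)); b2 = np.zeros(m) # hidden -> bottleneck
    W3 = rng.normal(0, 0.5, (m, h)); b3 = np.zeros(h) # bottleneck -> hidden
    W4 = rng.normal(0, 0.5, (h, n)); b4 = np.zeros(n) # hidden -> output

    lr = 0.05
    for step in range(5000):
        # Forward pass through encoder, bottleneck, and decoder.
        H1 = np.tanh(X @ W1 + b1)
        Z  = np.tanh(H1 @ W2 + b2)    # the m-dimensional code
        H2 = np.tanh(Z @ W3 + b3)
        Y  = H2 @ W4 + b4             # reconstruction of X

        # Backward pass: gradient of mean squared reconstruction error.
        dY  = 2.0 * (Y - X) / len(X)
        dW4 = H2.T @ dY;              db4 = dY.sum(0)
        dH2 = (dY @ W4.T) * (1 - H2**2)
        dW3 = Z.T @ dH2;              db3 = dH2.sum(0)
        dZ  = (dH2 @ W3.T) * (1 - Z**2)
        dW2 = H1.T @ dZ;              db2 = dZ.sum(0)
        dH1 = (dZ @ W2.T) * (1 - H1**2)
        dW1 = X.T @ dH1;              db1 = dH1.sum(0)

        # Gradient-descent update (in place).
        for p, g in [(W1, dW1), (b1, db1), (W2, dW2), (b2, db2),
                     (W3, dW3), (b3, db3), (W4, dW4), (b4, db4)]:
            p -= lr * g

    err = np.mean((Y - X) ** 2)
    print(f"final reconstruction MSE: {err:.5f}")

With the bottleneck held to m units, a low reconstruction error indicates that an m-dimensional code suffices to describe the data; the bottleneck activation Z then plays the role of the m-dimensional location on the embedded surface.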


Index Terms:
pattern recognition; data abstraction; backpropagation; connectionist networks; multidimensional data; feature-vectors; feature space; dimensionality-reduction; artificial intelligence; computerised pattern recognition
Citation:
E. Saund, "Dimensionality-Reduction Using Connectionist Networks," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 3, pp. 304-314, March 1989, doi:10.1109/34.21799