Representation and Processing of Structures with Binary Sparse Distributed Codes
March/April 2001 (vol. 13 no. 2)
pp. 261-276

Abstract—Schemes for compositional distributed representation include those that allow on-the-fly construction of fixed-dimensionality codevectors to encode structures of varying complexity. The similarity of such codevectors reflects both the structural and the semantic similarity of the represented structures. In this paper, we provide a comparative description of the sparse binary distributed representations developed in the framework of the associative-projective neural network architecture and of the better-known holographic reduced representations of Plate and binary spatter codes of Kanerva. The key procedure in associative-projective neural networks is context-dependent thinning, which binds codevectors while maintaining their sparseness. The codevectors are stored in a structured memory array that can be realized as a distributed auto-associative memory. Examples of distributed representation of structured data are given. Fast estimation of the similarity of analogical episodes by the overlap of their codevectors is used in modeling analogical reasoning, both for retrieval of analogs from memory and for analogical mapping.
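The three operations the abstract names — sparse binary codevectors, binding by context-dependent thinning, and similarity estimation by overlap — can be illustrated with a minimal sketch. This is not the paper's exact procedure: the dimensionality `N`, sparsity `M`, number of permutations `K`, and the particular thinning variant (superimpose by OR, then keep bits that recur in permuted copies of the superposition) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 10_000   # codevector dimensionality (assumed for illustration)
M = 100      # active bits per base codevector, so density M/N is low

def random_codevector():
    """A random sparse binary codevector: M ones out of N positions."""
    v = np.zeros(N, dtype=np.uint8)
    v[rng.choice(N, size=M, replace=False)] = 1
    return v

# Fixed random permutations shared by all thinning operations.
# K is chosen here so the thinned result has roughly M active bits.
K = 35
perms = [rng.permutation(N) for _ in range(K)]

def cdt(*vectors):
    """Context-dependent thinning (sketch): superimpose the inputs by
    element-wise OR, then keep only those active bits that are also
    active in at least one permuted copy of the superposition. Which
    bits survive depends on ALL inputs jointly, so the result binds
    the components while staying sparse."""
    z = np.bitwise_or.reduce(vectors)
    mask = np.zeros(N, dtype=np.uint8)
    for p in perms:
        mask |= z[p]          # permuted copy of the superposition
    return z & mask

def overlap(a, b):
    """Similarity estimate: count of shared active bits."""
    return int(np.sum(a & b))

role, filler = random_codevector(), random_codevector()
bound = cdt(role, filler)

# The bound codevector keeps roughly the base density and still shares
# many bits with each component, so similarity of structures can be
# estimated by cheap overlap counts, as the abstract describes.
print(bound.sum(), overlap(bound, role), overlap(bound, filler))
```

Because every surviving bit of `bound` comes from `role` or `filler`, the bound codevector remains similar to its components (unlike XOR- or convolution-based binding), which is what makes direct overlap a usable estimate of structural similarity.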

[1] S. Amari, “Characteristics of Sparsely Encoded Associative Memory,” Neural Networks, vol. 2, pp. 445-457, 1989.
[2] L.W. Barsalou, “Perceptual Symbol Systems,” Behavioral and Brain Sciences, vol. 22, pp. 577-609, 1999.
[3] E.B. Baum, J. Moody, and F. Wilczek, “Internal Representations for Associative Memory,” Biological Cybernetics, vol. 59, pp. 217-228, 1988.
[4] S. Deerwester, S.T. Dumais, G.W. Furnas, T.K. Landauer, and R.A. Harshman, “Indexing by Latent Semantic Analysis,” J. Am. Soc. for Information Science, vol. 41, no. 6, pp. 391-407, 1990.
[5] G. Dorffner and E. Prem, “Connectionism, Symbol Grounding, and Autonomous Agents,” Technical Report TR-93-17, Austrian Research Inst. for AI, 1993.
[6] C. Eliasmith and P. Thagard, “Integrating Structure and Meaning: A Distributed Model of Analogical Mapping,” Cognitive Science, vol. 25, no. 1, 2001.
[7] B. Falkenhainer, K. Forbus, and D. Gentner, “The Structure-Mapping Engine: Algorithm and Examples,” Artificial Intelligence, vol. 41, pp. 1-63, 1989.
[8] J.A. Feldman, “Neural Representation of Conceptual Knowledge,” Neural Connections, Mental Computation, L. Nadel, L.A. Cooper, P. Culicover, and R.M. Harnish, eds., pp. 68-103, 1989.
[9] K.D. Forbus, D. Gentner, and K. Law, “MAC/FAC: A Model of Similarity-Based Retrieval,” Cognitive Science, vol. 19, no. 2, pp. 141-205, 1995.
[10] P. Frasconi, M. Gori, and A. Sperduti, “A General Framework for Adaptive Processing of Data Structures,” Technical Report DSI-RT-15/97, Universita degli Studi di Firenze, Dipartimento di Sistemi e Informatica, Firenze, Italy, 1997.
[11] A.A. Frolov, D. Husek, and I.P. Muraviev, “Information Capacity and Recall Quality in Sparsely Encoded Hopfield-like Neural Network: Analytical Approaches and Computer Simulation,” Neural Networks, vol. 10, pp. 845-855, 1997.
[12] R.W. Gayler, “Multiplicative Binding, Representation Operators, and Analogy,” Advances in Analogy Research: Integration of Theory and Data from the Cognitive, Computational, and Neural Sciences, K. Holyoak, D. Gentner, and B. Kokinov, eds., Sofia, Bulgaria: New Bulgarian University, p. 405, 1998 (poster abstract; full poster available at http://cogprints.soton.ac.uk/abs/comp199807020).
[13] D. Gentner, “Structure-Mapping: A Theoretical Framework for Analogy,” Cognitive Science, vol. 7, pp. 155-170, 1983.
[14] D. Gentner and A.B. Markman, “Analogy-Based Reasoning,” Handbook of Brain Theory and Neural Networks, M.A. Arbib, ed., pp. 91-93, Cambridge, Mass.: MIT Press, 1995.
[15] D. Gentner and A.B. Markman, “Structure Mapping in Analogy and Similarity,” Am. Psychologist, vol. 52, no. 1, pp. 45-56, 1997.
[16] B. Gray, G.S. Halford, W.H. Wilson, and S. Phillips, “A Neural Net Model for Mapping Hierarchically Structured Analogs,” Proc. Fourth Conf. Australasian Cognitive Science Soc., Sept. 1997.
[17] G.S. Halford, W.H. Wilson, and S. Phillips, “Processing Capacity Defined by Relational Complexity: Implications for Comparative, Developmental, and Cognitive Psychology,” Behavioral and Brain Sciences, vol. 21, pp. 723-802, 1998.
[18] S. Harnad, “The Symbol Grounding Problem,” Physica D, vol. 42, pp. 335-346, 1990.
[19] D.O. Hebb, The Organization of Behavior. New York: Wiley, 1949.
[20] G.E. Hinton, “Mapping Part-Whole Hierarchies into Connectionist Networks,” Artificial Intelligence, vol. 46, pp. 47–75, 1990.
[21] D.R. Hofstadter and M. Mitchell, “Conceptual Slippage and Mapping: A Report of the Copycat Project,” Proc. 10th Ann. Conf. Cognitive Science Soc., pp. 601-607, 1988.
[22] K.J. Holyoak and P. Thagard, “Analogical Mapping by Constraint Satisfaction,” Cognitive Science, vol. 13, pp. 295-355, 1989.
[23] J.J. Hopfield, D.I. Feinstein, and R.G. Palmer, “Unlearning has a Stabilizing Effect in Collective Memories,” Nature, vol. 304, pp. 158-159, 1983.
[24] J.E. Hummel and K.J. Holyoak, “Distributed Representations of Structure: A Theory of Analogical Access and Mapping,” Psychological Rev., vol. 104, pp. 427-466, 1997.
[25] P. Kanerva, Sparse Distributed Memory. MIT Press, 1988.
[26] P. Kanerva, “Binary Spatter-Coding of Ordered K-Tuples,” Proc. Int'l Conf. Artificial Neural Networks—ICANN '96, C. von der Malsburg, W. von Seelen, J.C. Vorbruggen, and B. Sendhoff, eds., pp. 869-873, 1996.
[27] P. Kanerva, “Fully Distributed Representation,” Proc. 1997 Real World Computing Symp. (RWC '97), pp. 358-365, 1997.
[28] P. Kanerva, “Dual Role of Analogy in the Design of a Cognitive Computer,” Advances in Analogy Research: Integration of Theory and Data from the Cognitive, Computational, and Neural Sciences (workshop proc. Analogy '98), K. Holyoak, D. Gentner, and B. Kokinov, eds., pp. 164-170, 1998.
[29] E.M. Kussul, Associative Neuron-Like Structures. Kiev: Naukova Dumka, 1992 (in Russian).
[30] E.M. Kussul and T.N. Baidyk, “Design of a Neural-Like Network Architecture for Recognition of Object Shapes in Images,” Soviet J. Automation and Information Sciences, vol. 23, no. 5, pp. 53-58, 1990.
[31] E.M. Kussul and T.N. Baidyk, “On Information Encoding in Associative-Projective Neural Networks,” Preprint 93-3, Kiev, Ukraine: V.M. Glushkov Inst. of Cybernetics, 1993 (in Russian).
[32] E.M. Kussul and D.A. Rachkovskij, “Multilevel Assembly Neural Architecture and Processing of Sequences,” Neurocomputers and Attention: Vol. II Connectionism and Neurocomputers, A.V. Holden and V.I. Kryukov, eds., pp. 577-590, 1991.
[33] E.M. Kussul, D.A. Rachkovskij, and T.N. Baidyk, “Associative-Projective Neural Networks: Architecture, Implementation, Applications,” Proc. Fourth Int'l Conf. Neural Networks and Their Applications, pp. 463-476, Nov. 1991.
[34] A. Lansner and O. Ekeberg, “Reliability and Speed of Recall in an Associative Network,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 7, pp. 490-498, 1985.
[35] A.B. Markman, “Structural Alignment in Similarity and Its Influence on Category Structure,” Cognitive Studies, vol. 4, no. 4, pp. 19-37, 1997.
[36] P.M. Milner, “A Model for Visual Shape Recognition,” Psychological Rev., vol. 81, pp. 521-535, 1974.
[37] G. Palm, “On Associative Memory,” Biological Cybernetics, vol. 36, pp. 19-31, 1980.
[38] G. Palm and T. Bonhoeffer, “Parallel Processing for Associative and Neuronal Networks,” Biological Cybernetics, vol. 51, pp. 201-204, 1984.
[39] T.A. Plate, “Estimating Structural Similarity by Vector Dot Products of Holographic Reduced Representations,” Advances in Neural Information Processing Systems 6 (NIPS '93), J.D. Cowan, G. Tesauro, and J. Alspector, eds., pp. 1109-1116, 1994.
[40] T.A. Plate, “Holographic Reduced Representations,” IEEE Trans. Neural Networks, vol. 6, no. 3, pp. 623–641, 1995.
[41] T. Plate, “A Common Framework for Distributed Representation Schemes for Compositional Structure,” Connectionist Systems for Knowledge Representation and Deduction, F. Maire, R. Hayward, and J. Diederich, eds., pp. 15-34, 1997.
[42] T.A. Plate, “Analogy Retrieval and Processing with Distributed Vector Representations,” Expert Systems: The Int'l J. Knowledge Eng. and Neural Networks, special issue on Connectionist Symbol Processing, vol. 17, no. 1, pp. 29-40, 2000.
[43] J.B. Pollack, “Recursive Distributed Representations,” Artificial Intelligence, vol. 46, nos. 1-2, pp. 77–106, 1990.
[44] D.A. Rachkovskij, “On Numerical-Analytical Investigation of Neural Network Characteristics,” Neuron-Like Networks and Neurocomputers, pp. 13-23, Kiev, Ukraine: V.M. Glushkov Inst. of Cybernetics, 1990 (in Russian).
[45] D.A. Rachkovskij, “Development and Investigation of Multilevel Assembly Neural Networks,” Unpublished PhD dissertation, Kiev, Ukrainian SSR: V.M. Glushkov Inst. of Cybernetics, 1990 (in Russian).
[46] D.A. Rachkovskij and E.M. Kussul, “Binding and Normalization of Binary Sparse Distributed Representations by Context-Dependent Thinning,” Neural Computation, vol. 13, no. 2, pp. 411-452, 2001 (paper draft available online, ID code: cog00001240).
[47] B. Ross, “Distinguishing Types of Superficial Similarities: Different Effects on the Access and Use of Earlier Problems,” J. Experimental Psychology: Learning, Memory, and Cognition, vol. 15, pp. 456-468, 1989.
[48] L. Shastri and V. Ajjanagadde, “From Simple Associations to Systematic Reasoning: Connectionist Representation of Rules, Variables, and Dynamic Bindings using Temporal Synchrony,” Behavioral and Brain Sciences, vol. 16, pp. 417-494, 1993.
[49] G. Sjodin, “The Sparchunk Code: A Method to Build Higher-Level Structures in a Sparsely Encoded SDM,” Proc. Int'l Joint Conf. Neural Networks '98, pp. 1410-1415, 1998.
[50] G. Sjodin, P. Kanerva, B. Levin, and J. Kristoferson, “Holistic Higher-Level Structure-Forming Algorithms,” Proc. 1998 Real World Computing Symp.—RWC '98, pp. 299-304, 1998.
[51] P. Smolensky, “Tensor Product Variable Binding and the Representation of Symbolic Structures in Connectionist Systems,” Artificial Intelligence, vol. 46, pp. 159–216, 1990.
[52] A. Sperduti, “Labeling RAAM,” Connection Science, vol. 6, pp. 429-459, 1994.
[53] P. Thagard, K.J. Holyoak, G. Nelson, and D. Gochfeld, “Analog Retrieval by Constraint Satisfaction,” Artificial Intelligence, vol. 46, pp. 259-310, 1990.
[54] M.V. Tsodyks, “Associative Memory in Neural Networks with the Hebbian Learning Rule,” Modern Physics Letters B, vol. 3, pp. 555-560, 1989.
[55] D.S. Touretzky, “BoltzCONS: Dynamic Symbol Structures in a Connectionist Network,” Artificial Intelligence, vol. 46, pp. 5-46, 1990.
[56] A.A. Vedenov, “Spurious Memory,” Model Neural Networks, Moscow: I.V. Kurchatov Inst. of Atomic Energy (preprint IAE-4395/1), 1987.
[57] C. von der Malsburg, “Am I Thinking Assemblies?” Proc. 1984 Trieste Meeting on Brain Theory, G. Palm and A. Aertsen, eds., pp. 161-176, 1986.
[58] C.M. Wharton, K.J. Holyoak, P.E. Downing, T.E. Lange, T.D. Wickens, and E.R. Melz, “Below the Surface: Analogical Similarity and Retrieval Competition in Reminding,” Cognitive Psychology, vol. 26, pp. 64-101, 1994.
[59] D.J. Willshaw, O.P. Buneman, and H.C. Longuet-Higgins, “Non-Holographic Associative Memory,” Nature, vol. 222, pp. 960-962, 1969.

Index Terms:
Sparse coding, binary coding, binding, representation of structure, hierarchical representation, nested representation, long-term memory, analogy, compositional distributed representations, connectionist symbol processing, analogical retrieval, analogical mapping.
Citation:
Dmitri A. Rachkovskij, "Representation and Processing of Structures with Binary Sparse Distributed Codes," IEEE Transactions on Knowledge and Data Engineering, vol. 13, no. 2, pp. 261-276, March-April 2001, doi:10.1109/69.917565