<p><b>Abstract</b>—The schemes for compositional distributed representations include those that allow on-the-fly construction of fixed-dimensionality codevectors encoding structures of varying complexity. The similarity of such codevectors reflects both the structural and the semantic similarity of the represented structures. In this paper, we provide a comparative description of the sparse binary distributed representations developed in the framework of the associative-projective neural network architecture, the better-known holographic reduced representations of Plate, and the binary spatter codes of Kanerva. The key procedure in associative-projective neural networks is context-dependent thinning, which binds codevectors while maintaining their sparseness. The codevectors are stored in a structured memory array that can be realized as distributed auto-associative memory. Examples of distributed representation of structured data are given. Fast estimation of the similarity of analogical episodes by the overlap of their codevectors is used in modeling analogical reasoning, both for retrieval of analogs from memory and for analogical mapping.</p>
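The two operations the abstract names — binding by context-dependent thinning while preserving sparseness, and similarity estimation by codevector overlap — can be sketched roughly as follows. This is a minimal illustrative variant, not the paper's exact procedure: the dimensionality `N`, the number of active bits `M`, the number of permutations, and the additive thinning loop are all assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000   # codevector dimensionality (assumed for illustration)
M = 100      # active bits per base codevector (assumed, keeps codes sparse)

def random_codevector():
    """Sparse binary codevector: M ones out of N positions."""
    v = np.zeros(N, dtype=bool)
    v[rng.choice(N, size=M, replace=False)] = True
    return v

# Fixed random permutations shared by all thinning calls (scheme setup).
PERMS = [rng.permutation(N) for _ in range(50)]

def cdt(vectors, target_density):
    """Context-dependent thinning, additive-style sketch: superimpose the
    inputs by OR, then accumulate only those bits that co-occur with a
    permuted copy of the superposition, stopping once the result's
    density reaches roughly the density of a single codevector."""
    z = np.logical_or.reduce(vectors)   # superposition of the components
    out = np.zeros(N, dtype=bool)
    for p in PERMS:
        out |= z & z[p]                 # keep bits supported by the context
        if out.sum() >= target_density * N:
            break
    return out

def overlap(a, b):
    """Similarity as the count of shared active bits."""
    return int(np.count_nonzero(a & b))

a, b, c = (random_codevector() for _ in range(3))
bound_ab = cdt([a, b], target_density=M / N)  # bound pair, still sparse
# The bound code stays similar to its components but not to unrelated ones:
print(overlap(bound_ab, a), overlap(bound_ab, c))
```

The point of the sketch is that the bound result `bound_ab` has roughly the same number of active bits as a single codevector (so nesting does not blow up density), yet its overlap with a component such as `a` is far above the chance-level overlap with an unrelated codevector `c` — which is what makes fast similarity estimation by overlap possible.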
Sparse coding, binary coding, binding, representation of structure, hierarchical representation, nested representation, long-term memory, analogy, compositional distributed representations, connectionist symbol processing, analogical retrieval, analogical mapping.

D. A. Rachkovskij, "Representation and Processing of Structures with Binary Sparse Distributed Codes," in IEEE Transactions on Knowledge & Data Engineering, vol. 13, no. , pp. 261-276, 2001.