
T. G. Clarkson, D. Gorse, J. G. Taylor, and C. K. Ng, "Learning Probabilistic RAM Nets Using VLSI Structures," IEEE Transactions on Computers, vol. 41, no. 12, pp. 1552-1561, December 1992.
Hardware-realizable learning probabilistic RAMs (pRAMs) are described which implement local reinforcement rules using synaptic rather than threshold noise in the stochastic search procedure. The design allows for both global and local rewards and penalties (the latter implementing a modified version of backpropagation). The architecture supports serial updating of the weights of a pRAM net according to a reward/penalty learning rule. A new set of pRAM outputs can be generated at least every 100 μs, which is faster than the response time of biological neurons.
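The reward/penalty idea the abstract summarizes can be illustrated in software. The sketch below is a minimal, hypothetical single-pRAM model (class and parameter names are the author's own, not from the paper): each of the 2^n memory locations stores a firing probability addressed by the binary input pattern, and a global reward signal moves the addressed probability toward the action just taken, while a penalty (scaled by a factor λ) moves it toward the opposite action.

```python
import random

class PRAM:
    """Minimal probabilistic RAM (pRAM) neuron sketch.

    Each of the 2**n memory locations stores a firing probability
    alpha[u]; the binary input pattern addresses one location u and
    the neuron fires (output 1) with probability alpha[u].
    """

    def __init__(self, n_inputs, rho=0.1, lam=0.05):
        self.alpha = [0.5] * (2 ** n_inputs)  # start unbiased
        self.rho = rho                        # learning rate
        self.lam = lam                        # penalty weight relative to reward

    def fire(self, inputs):
        # Form the memory address from the input bits, then fire stochastically.
        u = int("".join(str(b) for b in inputs), 2)
        a = 1 if random.random() < self.alpha[u] else 0
        return u, a

    def update(self, u, a, reward, penalty):
        # Reward/penalty rule: reward pulls alpha[u] toward the action a
        # taken; penalty pulls it toward the complementary action (1 - a).
        da = reward * (a - self.alpha[u]) \
            + self.lam * penalty * ((1 - a) - self.alpha[u])
        self.alpha[u] += self.rho * da
        # Keep the stored value a valid probability.
        self.alpha[u] = min(1.0, max(0.0, self.alpha[u]))
```

As a usage example, a two-input pRAM trained this way (reward when the output matches a target, penalty otherwise) can learn an arbitrary Boolean function of its address lines, such as XOR, since each input pattern has its own stored probability. In the hardware described in the paper, the update step is applied serially to the weights of the whole net.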
[1] T. G. Clarkson, D. Gorse, and J. G. Taylor, "Hardware-realisable models of neural processing," in Proc. IEE Int. Conf. Artificial Neural Networks, London, 1989, pp. 242-246.
[2] D. Gorse and J. G. Taylor, "A general model of stochastic neural processing," Biol. Cybern., vol. 63, pp. 299-306, 1990.
[3] D. Gorse and J. G. Taylor, "Universal associative stochastic learning automata," Neural Network World, vol. 1, pp. 192-202, 1991.
[4] D. Gorse and J. G. Taylor, "A continuous input RAM-based stochastic neural model," Neural Networks, vol. 4, pp. 657-666, 1991.
[5] T. G. Clarkson, D. Gorse, and J. G. Taylor, "From wetware to hardware: Reverse engineering using probabilistic RAMs," to appear in a Special Issue: "Recent Advances in Neural Nets," J. Intell. Syst., vol. 2, pp. 11-30, 1992.
[6] H. Eguchi, T. Faruta, H. Hariguchi, S. Oteki, and T. Kitaguchi, "Neural network LSI chip with on-chip learning," in Proc. IJCNN '91, vol. I, Seattle, WA, 1991, pp. 453-456.
[7] C. Schneider and H. C. Card, "CMOS implementation of analog Hebbian synaptic learning circuits," in Proc. IJCNN '91, vol. I, Seattle, WA, 1991, pp. 437-442.
[8] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning internal representations by error propagation," in Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vols. 1 and 2. Cambridge, MA: MIT Press, 1986.
[9] D. Gorse and J. G. Taylor, "An analysis of noisy RAM and neural nets," Physica D, vol. 34, pp. 90-114, 1989.
[10] D. E. Rumelhart et al., Parallel Distributed Processing: Explorations in the Microstructure of Cognition. Cambridge, MA: MIT Press, 1986.
[11] A. G. Barto and P. Anandan, "Pattern recognizing stochastic learning automata," IEEE Trans. Syst., Man, Cybern., vol. 15, pp. 360-375, 1985.
[12] A. G. Barto, R. S. Sutton, and C. W. Anderson, "Neuronlike adaptive elements that can solve difficult learning control problems," IEEE Trans. Syst., Man, Cybern., vol. 13, pp. 834-846, 1983.
[13] J. G. Taylor, "Spontaneous behavior in neural networks," J. Theor. Biol., vol. 36, pp. 513-528, 1972.
[14] P. C. Bressloff and J. G. Taylor, "Random iterative networks," Phys. Rev. A, vol. 41, pp. 1126-1137, 1990.
[15] T. G. Clarkson, J. G. Taylor, and D. Gorse, "pRAM automata," in Proc. IEEE Int. Workshop Cellular Neural Networks and their Appl. (CNNA '90), Budapest, 1990, pp. 235-243.
[16] P. Y. Alla et al., "Silicon integration of learning algorithms and other auto-adaptive properties in a digital feedback neural network," in VLSI Design of Neural Networks, Ramacher and Rückert, Eds. Boston, MA: Kluwer, 1991, pp. 174-175.
[17] D. Gorse and J. G. Taylor, "Hardware-realisable learning algorithms," in Proc. INNC-90, Paris, 1990. Dordrecht: Kluwer, pp. 821-824.
[18] G. E. Hinton and T. J. Sejnowski, "Learning and relearning in Boltzmann machines," in Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1: Foundations, pp. 282-317.
[19] D. Gorse and J. G. Taylor, "Learning sequential structure with recurrent pRAM nets," in Proc. IJCNN '91, vol. II, Seattle, WA, 1991, pp. 37-42.
[20] D. Servan-Schreiber, A. Cleeremans, and J. L. McClelland, "Encoding sequential structure in simple recurrent networks," Carnegie Mellon Univ. Tech. Rep. CMU-CS-88-183, Nov. 1988.
[21] C. L. Giles, G. Z. Sun, H. H. Chen, Y. C. Lee, and D. Chen, "Higher order recurrent networks and grammatical inference," in Advances in Neural Information Processing Systems 2, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1990, pp. 380-387.
[22] A. H. Klopf, The Hedonistic Neuron: A Theory of Memory, Learning and Intelligence. Washington, DC: Hemisphere, 1982.
[23] D. E. Koshland, "Bacterial chemotaxis in relation to neurobiology," Ann. Rev. Neurosci., vol. 3, pp. 43-75, 1980.
[24] K. S. Narendra and M. A. L. Thathachar, "Learning automata: A survey," IEEE Trans. Syst., Man, Cybern., vol. 4, pp. 323-334, 1974.
[25] B. Katz, Nerve, Muscle and Synapse. New York: McGraw-Hill, 1966.
[26] A. G. Barto and R. S. Sutton, "Landmark learning: An illustration of associative search," Biol. Cybern., vol. 42, pp. 1-8, 1981.
[27] C. E. Myers, "Reinforcement training when results are delayed and interleaved in time," in Proc. INNC-90, Paris, 1990, pp. 860-863.
[28] S. Jones et al., "Toroidal neural network: Architecture and processor granularity issues," in VLSI Design of Neural Networks, Ramacher and Rückert, Eds. Boston, MA: Kluwer, 1991, pp. 174-175.
[29] U. Ramacher et al., "Design of a 1st generation neurocomputer," in VLSI Design of Neural Networks, Ramacher and Rückert, Eds. Boston, MA: Kluwer, 1991, pp. 174-175.
[30] F. M. A. Salam and Y. Wang, "A real-time experiment using a 50-neuron CMOS analog silicon chip with on-chip digital learning," IEEE Trans. Neural Networks, vol. 2, no. 4, pp. 461-464, 1991.
[31] T. G. Clarkson, C. K. Ng, D. Gorse, and J. G. Taylor, "A serial update VLSI architecture for the learning probabilistic RAM neuron," in Proc. ICANN-91, Helsinki, 1991, pp. 1573-1576.