A General Framework for Concurrent Simulation on Neural Network Models
July 1992 (vol. 18, no. 7)
pp. 551-562

The analysis of complex neural network models via analytical techniques is often quite difficult due to the large numbers of components involved and the nonlinearities associated with these components. The authors present a framework for simulating neural networks as discrete event nonlinear dynamical systems. This includes neural network models whose components are described by continuous-time differential equations or by discrete-time difference equations. Specifically, the authors consider the design and construction of a concurrent object-oriented discrete event simulation environment for neural networks. The use of an object-oriented language provides the data abstraction facilities necessary to support modification and extension of the simulation system at a high level of abstraction. Furthermore, the ability to specify concurrent processing supports execution on parallel architectures. The use of this system is demonstrated by simulating a specific neural network model on a general-purpose parallel computer.

[1] DARPA Neural Network Study, B. Widrow, Study Director, Fairfax, VA: AFCEA International Press, 1988.
[2] J. Alspector and D. Hammerstrom, "Electronic and optical implementations sessions," in Proc. Int. Joint Conf. Neural Networks, vol. 1, pp. 415-592, 1991.
[3] R. P. Lippmann et al., Advances in Neural Information Processing Systems 3, San Mateo, CA: Morgan Kaufmann, 1991, pp. 993-1052.
[4] C. A. Mead, Analog VLSI and Neural Systems. Reading, MA: Addison-Wesley, 1989.
[5] E. Sánchez-Sinencio, Ed., IEEE Trans. Neural Networks, Special Issue on Neural Network Hardware, vol. 2, pp. 192-251, 1991.
[6] G. L. Heileman et al., "A neural net associative memory for real-time applications," Neural Computation, vol. 2, pp. 107-115, 1990.
[7] D. A. Pomerleau, G. L. Gusciora, D. S. Touretzky, and H. T. Kung, "Neural network simulation at warp speed: How we got 17 million connections per second," in Proc. Int. Conf. Neural Networks, San Diego, CA, June 1988.
[8] X. Zhang et al., "An efficient implementation of the back-propagation algorithm on the connection machine CM-2," in Advances in Neural Information Processing Systems 2, San Mateo, CA: Morgan Kaufmann, 1990, pp. 801-809.
[9] N. Gehani and W. D. Roome, "Concurrent C++: Concurrent programming with class(es)," Software--Practice & Experience, vol. 18, pp. 1157-1177, 1988.
[10] S. B. Lippman, C++ Primer, 2nd ed. Reading, MA: Addison-Wesley, 1991.
[11] B. Stroustrup, The C++ Programming Language. Reading, MA: Addison-Wesley, 1987.
[12] N. Gehani and W. D. Roome, The Concurrent C Programming Language. Summit, NJ: Silicon Press, 1989.
[13] B. Meyer, Object-Oriented Software Construction. Englewood Cliffs, NJ: Prentice-Hall, 1988.
[14] C. A. R. Hoare, Communicating Sequential Processes. Englewood Cliffs, NJ: Prentice-Hall, 1985.
[15] P. B. Hansen, "Distributed processes: A concurrent programming concept," Commun. ACM, vol. 21, no. 11, pp. 934-941, Nov. 1978.
[16] G. Agha, "Concurrent object-oriented programming," Commun. ACM, vol. 33, pp. 125-141, 1990.
[17] S. Grossberg, "Nonlinear neural networks: Principles, mechanisms, and architectures," Neural Networks, vol. 1, pp. 17-61, 1988.
[18] F. J. Pineda, "Dynamics and architecture for neural computation," J. Complexity, vol. 4, pp. 216-245, 1988.
[19] R. Righter and J. C. Walrand, "Distributed simulation of discrete event systems," Proc. IEEE, vol. 77, pp. 99-113, 1989.
[20] J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," in Proc. National Academy of Science USA, vol. 79, pp. 2554-2558, 1982.
[21] W. D. Roome, "The CTK: An efficient multi-processor kernel," AT&T Bell Laboratories, 1986.
[22] G. A. Carpenter and S. Grossberg, "A massively parallel architecture for a self-organizing neural pattern recognition machine," Computer Vision, Graphics, and Image Processing, vol. 37, pp. 54-115, 1987.
[23] S. Grossberg, "Adaptive pattern recognition and universal recoding II: Feedback, expectation, olfaction, and illusions," Biological Cybernetics, vol. 23, pp. 187-202, 1976.
[24] G. L. Heileman and M. Georgiopoulos, "The augmented ART1 network," in Proc. Int. Joint Conf. Neural Networks, pp. 467-472, 1991.
[25] G. L. Heileman and M. Georgiopoulos, "A real-time representation of the ART1 network," Rep. no. EECE 91-001, Univ. of New Mexico, Jan. 1991.
[26] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning internal representations by error propagation," in Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vols. 1 and 2. Cambridge, MA: MIT Press, 1986.
[27] L. B. Almeida, "A learning rule for asynchronous perceptrons with feedback in a combinatorial environment," in Proc. IEEE Int. Conf. Neural Networks, vol. II, pp. 609-618, 1987.
[28] R. Hecht-Nielsen, "Theory of the backpropagation neural network," in Proc. Int. Joint Conf. Neural Networks (Washington, DC), vol. I, pp. 593-605, 1989.

Index Terms:
concurrent simulation; neural network models; nonlinearities; discrete event nonlinear dynamical systems; continuous-time differential equations; discrete-time difference equations; concurrent object-oriented discrete event simulation; object-oriented language; data abstraction; parallel architectures; general-purpose parallel computer; data structures; discrete event simulation; neural nets; object-oriented programming; parallel languages
G.L. Heileman, M. Georgiopoulos, W.D. Roome, "A General Framework for Concurrent Simulation on Neural Network Models," IEEE Transactions on Software Engineering, vol. 18, no. 7, pp. 551-562, July 1992, doi:10.1109/32.148474