Computer Aided Analysis and Derivation for Artificial Neural Systems
August 1992 (vol. 18 no. 8)
pp. 728-735

The theoretical analysis and derivation of artificial neural systems, which consist essentially of manipulating symbolic mathematical objects according to mathematical and biological knowledge, can be performed more efficiently with computer assistance by using and extending methods and systems of symbolic computation. After presenting the mathematical characteristics of neural systems and a brief review of Lyapunov stability theory, the authors describe some features and capabilities of existing systems, together with extensions for manipulating the objects that occur in the analysis of neural systems. Strategies and a toolkit developed in MACSYMA for computer-aided analysis and derivation are described. A concrete example demonstrates the derivation of a hybrid neural system, i.e., a system whose learning rule combines elements of supervised and unsupervised learning. Future work and research directions are indicated.
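The authors' toolkit ran in MACSYMA and is not reproduced here. As a rough illustration of the kind of step such a toolkit mechanizes, the following SymPy sketch checks a Lyapunov candidate symbolically for a one-dimensional toy system; the dynamics, the candidate function, and all names are assumptions chosen for illustration, not the paper's actual neural system or code.

```python
import sympy as sp

# Illustrative sketch (assumed toy system, not the paper's model):
# verify symbolically that V(v) = v^2/2 is a Lyapunov function for
# the dissipative dynamics dv/dt = -v^3 - v.
t = sp.symbols('t')
v = sp.Function('v')                  # single state variable v(t)

dynamics = -v(t)**3 - v(t)            # assumed dynamics dv/dt
V = sp.Rational(1, 2) * v(t)**2       # Lyapunov candidate V = v^2 / 2

# Chain rule gives dV/dt = v * dv/dt; substitute the dynamics for dv/dt.
dV_dt = sp.diff(V, t).subs(sp.Derivative(v(t), t), dynamics)
dV_dt = sp.expand(dV_dt)

# dV/dt = -(v^4 + v^2) <= 0 for all real v, so V decreases along
# trajectories -- the symbolic analogue of the stability checks the
# paper automates for neural dynamics.
print(dV_dt)
```

The same pattern (differentiate a candidate energy, substitute the network equations, and test the sign of the result) underlies Lyapunov-style stability arguments for the neural systems treated in the paper.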

[1] S. Amari and M. A. Arbib, Eds., Competition and Cooperation in Neural Networks. New York: Springer-Verlag, 1982.
[2] A. F. Atiya, "Learning on a general network," in Proc. IEEE Conf. Neural Information Processing Systems, 1987.
[3] M. A. Cohen and S. Grossberg, "Absolute stability of global pattern formation and parallel memory storage by competitive neural networks," IEEE Trans. Syst., Man, Cybern., vol. SMC-13, pp. 815-826, 1983.
[4] C. L. Giles, R. D. Griffin, and T. Maxwell, "Encoding geometric invariances in higher order neural networks," in Neural Information Processing Systems (D. Z. Anderson, Ed.). New York: Amer. Inst. Phys., pp. 301-309, 1988.
[5] B. S. Goh and T. T. Agnew, "Stability in Gilpin and Ayala's models of competition," J. Math. Bio., vol. 4, pp. 275-279, 1977.
[6] S. Grossberg, Studies of Mind and Brain: Neural Principles of Learning, Perception, Development, Cognition, and Motor Control. Boston: Reidel Press, 1982.
[7] S. Grossberg, "Nonlinear neural networks: Principles, mechanisms, and architectures," Neural Networks, vol. 1, pp. 17-61, 1988.
[8] J. J. Hopfield, "Neurons with graded response have collective computational properties like those of two-state neurons," Proc. Natl. Acad. Sci. USA, vol. 81, pp. 3088-3092, 1984.
[9] J. P. LaSalle, "An invariance principle in the theory of stability," in Differential Equations and Dynamical Systems (J. K. Hale and J. P. LaSalle, Eds.). New York-London: Academic Press, pp. 277-286, 1967.
[10] D. S. Levine, "Neural population modeling and psychology: A review," Mathematical Biosciences, vol. 66, pp. 1-86, 1983.
[11] S. Lefschetz, Differential Equations: Geometric Theory. New York-London: Interscience Publishers, 1962.
[12] VAX UNIX MACSYMA™ Reference Manual, Version 11, Symbolics, Inc., 1985.
[13] A. N. Michel, J. A. Farrel, and W. Porod, "Stability results for neural networks," in Neural Information Processing Systems (D. Z. Anderson, Ed.). New York: Amer. Inst. Phys., pp. 554-563, 1988.
[14] B. A. Pearlmutter, "Learning state space trajectories in recurrent neural networks," Neural Computation, vol. 1, pp. 263-269, 1989.
[15] U. Ramacher and B. Schürmann, "Unified description of neural algorithms for time independent pattern recognition," in VLSI Design of Neural Networks (U. Ramacher and U. Rückert, Eds.). Kluwer, pp. 255-270, 1990.
[16] D. E. Rumelhart and J. L. McClelland, Eds., Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vols. 1-2. Cambridge, MA: MIT Press, 1986.
[17] B. Schürmann, "Stability and adaptation in artificial neural systems," Phys. Rev. A, vol. 40, pp. 2681-2688, 1989.
[18] B. Schürmann, J. Hollatz, and U. Ramacher, "Adaptive recurrent neural networks and dynamic stability," in XI Sitges Conference: Neural Networks (L. Garrido, Ed.). New York-Heidelberg: Springer-Verlag, pp. 49-63, 1990.
[19] B. Schürmann and D. Wang, "Stable neurodynamics and symbolic computation," in Proc. INNC-90, July 1990, p. 1013; RISC-Linz Series no. 90-38.0, Johannes Kepler University, Austria.
[20] D. Wang, "Differentiation and integration of indefinite summations with respect to indexed variables," RISC-Linz Series no. 90-37.0, Johannes Kepler University, Austria.
[21] D. Wang, "A toolkit for manipulating indefinite summations with application to neural networks," in Proc. ISSAC '91, July 1991, pp. 462-463; ACM SIGSAM Bulletin, vol. 25, no. 3, pp. 18-27, 1991.
[22] A. S. Weigend, B. A. Huberman, and D. E. Rumelhart, "Predicting the future: A connectionist approach," Preprint no. Stanford-PDP-90-01, Stanford Univ., Apr. 1990.

Index Terms:
computer aided analysis and derivation; symbolic mathematical objects manipulation; artificial neural systems; biological knowledge; symbolic computation; Lyapunov stability theory; toolkit; MACSYMA; learning rule; artificial intelligence; computer aided analysis; Lyapunov methods; mathematics computing; neural nets; symbol manipulation
Citation:
D. Wang, B. Schürmann, "Computer Aided Analysis and Derivation for Artificial Neural Systems," IEEE Transactions on Software Engineering, vol. 18, no. 8, pp. 728-735, Aug. 1992, doi:10.1109/32.153382