
D. Wang and B. Schürmann, "Computer Aided Analysis and Derivation for Artificial Neural Systems," IEEE Transactions on Software Engineering, vol. 18, no. 8, pp. 728-735, August 1992.
The theoretical analysis and derivation of artificial neural systems, which consists essentially of manipulating symbolic mathematical objects according to certain mathematical and biological knowledge, can be done more efficiently with computer assistance by using and extending methods and systems of symbolic computation. After presenting the mathematical characteristics of neural systems and a brief review of Lyapunov stability theory, the authors describe some features and capabilities of existing systems and their extension for manipulating the objects that occur in the analysis of neural systems. Some strategies and a toolkit developed in MACSYMA for computer-aided analysis and derivation are described. A concrete example demonstrates the derivation of a hybrid neural system, i.e., a system whose learning rule combines elements of supervised and unsupervised learning. Future work and research directions are indicated.
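The workflow the abstract describes — expressing a neural system's dynamics symbolically and checking a Lyapunov stability condition by computer algebra — can be sketched with a modern system. The paper's toolkit was built in MACSYMA; the fragment below uses SymPy as a stand-in, and the one-neuron dynamics and Lyapunov candidate are illustrative assumptions, not the hybrid system derived in the paper.

```python
# Sketch of the paper's approach: a Lyapunov-style stability argument
# carried out symbolically in a computer algebra system.  The paper's
# toolkit was written in MACSYMA; SymPy serves as a stand-in here.
# The one-neuron model and the candidate V are illustrative assumptions.
import sympy as sp

x, w = sp.symbols('x w', real=True)

# Additive neuron model: dx/dt = f(x) = -x + tanh(w*x)
f = -x + sp.tanh(w * x)

# Candidate Lyapunov function V(x) = x**2 / 2 and its derivative
# along trajectories, dV/dt = V'(x) * f(x)
V = x**2 / 2
Vdot = sp.expand(sp.diff(V, x) * f)

print(Vdot)  # equals -x**2 + x*tanh(w*x)

# For |w| <= 1 we have |tanh(w*x)| < |x| whenever x != 0, so dV/dt < 0
# away from the origin and x = 0 is asymptotically stable.
# Numeric spot-check at w = 1/2, x = 2:
print(Vdot.subs({w: sp.Rational(1, 2), x: 2}).evalf() < 0)  # True
```

In the paper this kind of derivative-along-trajectories computation is done for vector-valued systems with indexed sums, which is what motivated the authors' extensions for manipulating indefinite summations.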
[1] S. Amari and M. A. Arbib, Eds., Competition and Cooperation in Neural Networks. New York: Springer-Verlag, 1982.
[2] A. F. Atiya, "Learning on a general network," in Proc. IEEE Conf. Neural Information Processing Systems, 1987.
[3] M. A. Cohen and S. Grossberg, "Absolute stability of global pattern formation and parallel memory storage by competitive neural networks," IEEE Trans. Syst., Man, Cybern., vol. SMC-13, pp. 815-826, 1983.
[4] C. L. Giles, R. D. Griffin, and T. Maxwell, "Encoding geometric invariances in higher order neural networks," Neural Information Processing Systems (D. Z. Anderson, Ed.). New York: Amer. Inst. Phys., pp. 301-309, 1988.
[5] B. S. Goh and T. T. Agnew, "Stability in Gilpin and Ayala's models of competition," J. Math. Bio., vol. 4, pp. 275-279, 1977.
[6] S. Grossberg, Studies of Mind and Brain: Neural Principles of Learning, Perception, Development, Cognition, and Motor Control. Boston: Reidel Press, 1982.
[7] S. Grossberg, "Nonlinear neural networks: Principles, mechanisms, and architectures," Neural Networks, vol. 1, pp. 17-61, 1988.
[8] J. J. Hopfield, "Neurons with graded response have collective computational properties like those of two-state neurons," Proc. Natl. Acad. Sci. USA, vol. 81, pp. 3088-3092, 1984.
[9] J. P. LaSalle, "An invariance principle in the theory of stability," Differential Equations and Dynamical Systems (J. K. Hale and J. P. LaSalle, Eds.). New York-London: Academic Press, pp. 277-286, 1967.
[10] D. S. Levine, "Neural population modeling and psychology: A review," Mathematical Biosciences, vol. 66, pp. 1-86, 1983.
[11] S. Lefschetz, Differential Equations: Geometric Theory. New York-London: Interscience Publishers, 1962.
[12] VAX UNIX MACSYMA Reference Manual, Version 11, Symbolics, Inc., 1985.
[13] A. N. Michel, J. A. Farrel, and W. Porod, "Stability results for neural networks," Neural Information Processing Systems (D. Z. Anderson, Ed.). New York: Amer. Inst. Phys., pp. 554-563, 1988.
[14] B. A. Pearlmutter, "Learning state space trajectories in recurrent neural networks," Neural Computation, vol. 1, pp. 263-269, 1989.
[15] U. Ramacher and B. Schürmann, "Unified description of neural algorithms for time independent pattern recognition," VLSI Design of Neural Networks (U. Ramacher and U. Rückert, Eds.). Kluwer, pp. 255-270, 1990.
[16] D. E. Rumelhart and D. McClelland, Eds., Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vols. 1-2. Cambridge, MA: MIT Press, 1986.
[17] B. Schürmann, "Stability and adaptation in artificial neural systems," Phys. Rev. A, vol. 40, pp. 2681-2688, 1989.
[18] B. Schürmann, J. Hollatz, and U. Ramacher, "Adaptive recurrent neural networks and dynamic stability," XI Sitges Conference: Neural Networks (L. Garrido, Ed.). New York-Heidelberg: Springer-Verlag, pp. 49-63, 1990.
[19] B. Schürmann and D. Wang, "Stable neurodynamics and symbolic computation," in Proc. INNC-90, July 1990, p. 1013; RISC-Linz Series no. 90-38.0, Johannes Kepler University, Austria.
[20] D. Wang, "Differentiation and integration of indefinite summations with respect to indexed variables," RISC-Linz Series no. 90-37.0, Johannes Kepler University, Austria.
[21] D. Wang, "A toolkit for manipulating indefinite summations with application to neural networks," in Proc. ISSAC'91, July 1991, pp. 462-463; ACM SIGSAM Bulletin, vol. 25, no. 3, pp. 18-27, 1991.
[22] A. S. Weigend, B. A. Huberman, and D. E. Rumelhart, "Predicting the future: A connectionist approach," Preprint no. Stanford-PDP-90-01, Stanford Univ., Apr. 1990.