A Novel Feature Recognition Neural Network and its Application to Character Recognition
January 1994 (vol. 16 no. 1)
pp. 98-106

Presents a feature recognition network for pattern recognition that learns patterns by remembering their different segments. The base algorithm for this network is a Boolean net algorithm that the authors developed in earlier research. Simulation results show that the network can recognize patterns under significant noise, deformation, translation, and even scaling. The network is compared with existing popular networks used for the same purpose, especially the Neocognitron, and is also analyzed with regard to interconnection complexity and information storage/retrieval.
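The core idea the abstract describes, learning a pattern by remembering its segments and recognizing noisy inputs by segment-wise Boolean matching, can be illustrated with a minimal sketch. This is not the authors' algorithm; the segment size, the flip tolerance, and all function names (`segments`, `train`, `recognize`) are illustrative assumptions.

```python
# Minimal sketch (not the paper's algorithm): segment-based Boolean
# pattern recognition. A binary pattern is split into fixed-size
# segments; training stores each class's segment list, and recognition
# picks the class whose stored segments best match the input,
# tolerating a few flipped pixels per segment.

SEG = 2  # segment side length (assumed)

def segments(pattern):
    """Split a 2D binary pattern into SEG x SEG segment tuples."""
    segs = []
    for r in range(0, len(pattern), SEG):
        for c in range(0, len(pattern[0]), SEG):
            segs.append(tuple(pattern[r + i][c + j]
                              for i in range(SEG) for j in range(SEG)))
    return segs

def train(examples):
    """Remember each class's segments ('learning by remembering')."""
    return {label: segments(p) for label, p in examples.items()}

def recognize(memory, pattern, max_flips=1):
    """Return the class with the most matching segments, treating
    segments that differ in at most max_flips bits as equal."""
    probe = segments(pattern)
    def score(stored):
        return sum(1 for s, t in zip(stored, probe)
                   if sum(a != b for a, b in zip(s, t)) <= max_flips)
    return max(memory, key=lambda lbl: score(memory[lbl]))

# Two 4x4 binary example patterns.
X = [[1,0,0,1],[0,1,1,0],[0,1,1,0],[1,0,0,1]]
O = [[1,1,1,1],[1,0,0,1],[1,0,0,1],[1,1,1,1]]
mem = train({"X": X, "O": O})

noisy_X = [row[:] for row in X]
noisy_X[0][0] = 0  # flip one pixel to simulate noise
print(recognize(mem, noisy_X))  # prints: X
```

Because each segment is matched independently, a few corrupted pixels only degrade the scores of the segments they fall in, which is the intuition behind the noise tolerance the abstract reports.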

[1] R. P. Lippmann, "An introduction to computing with neural nets," IEEE ASSP Mag., vol. 4, pp. 4-22, Apr. 1987.
[2] J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," in Proc. Nat. Acad. Sci., vol. 79, pp. 2554-2558, Apr. 1982.
[3] J. J. Hopfield and D. W. Tank, "Computing with neural circuits: A model," Science, vol. 233, pp. 625-633, Aug. 1986.
[4] R. G. Gallager, Information Theory and Reliable Communication. New York: Wiley, 1972, p. 80.
[5] G. A. Carpenter and S. Grossberg, "Neural dynamics of category learning and recognition: Attention, memory consolidation and amnesia," in Brain Structure, Learning, and Memory (AAAS Symposium Series), J. Davis, R. Newburgh, and E. Wegman, Eds., 1986.
[6] S. Grossberg, The Adaptive Brain, vols. I and II. Amsterdam: Elsevier, North-Holland, 1986.
[7] F. Rosenblatt, Principles of Neurodynamics. New York: Spartan, 1962.
[8] R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis. New York: Wiley, 1973.
[9] T. Kohonen, Self-Organization and Associative Memory. Berlin, Germany: Springer-Verlag, 1988, p. 132.
[10] M. Minsky and S. Papert, Perceptrons: An Introduction to Computational Geometry. Cambridge, MA: MIT Press, 1969.
[11] R. M. Glorioso and F. C. Osorio, Engineering Intelligent Systems: Concepts, Theory & Applications. Bedford, MA: Digital, 1980, p. 318.
[12] D. E. Rumelhart and J. L. McClelland, Eds., Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vols. 1-2. Cambridge, MA: MIT Press, 1986.
[13] M. A. Arbib, "Neural computing," J. Parallel and Distributed Processing, vol. 6, no. 2, pp. 185-216, Apr. 1989.
[14] R. C. Johnson and C. Brown, Cognizers: Neural Networks and Machines that Think. New York: Wiley, 1988.
[15] H. D. Block, B. W. Knight, and F. Rosenblatt, "Analysis of a four-layer series-coupled perceptron. II," Rev. Modern Phys., vol. 34, pp. 135-152, Jan. 1962.
[16] K. Fukushima and S. Miyake, "Neocognitron: A new algorithm for pattern recognition tolerant of deformations and shifts in positions," Pattern Recognition, vol. 15, no. 6, pp. 455-469, 1982.
[17] K. Fukushima, S. Miyake, and T. Ito, "Neocognitron: A neural network model for a mechanism of visual pattern recognition," IEEE Trans. Syst., Man, Cybern., vol. SMC-13, no. 5, pp. 826-834, Sept. 1983.
[18] S. Kollias and D. Anastassiou, "An adaptive least squares algorithm for the efficient training of artificial neural networks," IEEE Trans. Circuits Syst., vol. 36, no. 8, pp. 1092-1101, Aug. 1989.
[19] P. Werbos, "Building and understanding adaptive systems: A statistical/numerical approach to factory automation and brain research," IEEE Trans. Syst., Man, Cybern., vol. SMC-17, Jan.-Feb. 1987.
[20] M. L. Brady, R. Raghavan, and J. Slawny, "Back-propagation fails to separate where perceptrons succeed," IEEE Trans. Circuits Syst., vol. 36, pp. 665-674, 1989.
[21] D. E. Van Den Bout and T. K. Miller, "A digital architecture employing stochasticism for the simulation of Hopfield neural nets," IEEE Trans. Circuits Syst., vol. 36, no. 5, pp. 732-746, May 1989.
[22] B. Hussain and M. Kabuka, "Neural net transformation of arbitrary Boolean functions," in SPIE's 1992 Int. Symp. Opt. Appl. Sci.: Neural and Stochastic Methods in Image and Signal Processing, San Diego, CA, July 1992.
[23] R. T. Chin, "Automated visual inspection, survey: 1981 to 1987," Computer Vision, Graphics and Image Processing, vol. 41, pp. 346-381, 1988.
[24] J. Cadzow, "Recursive digital filter synthesis via gradient based algorithms," IEEE Trans. Acoust., Speech, Signal Processing, vol. ASSP-24, pp. 349-355, 1976.
[25] P. J. Werbos, "Back-propagation: Past and future," in Proc. IEEE Int. Conf. Neural Networks. New York: IEEE Press, 1988, pp. 343-353.
[26] I. Morishita and A. Yajima, "Analysis and simulation of networks of mutually inhibiting neurons," Kybernetik, vol. 11, pp. 154-165, 1972.
[27] J. J. Hopfield, "Neurons with graded response have collective computational properties like those of two-state neurons," Proc. Nat. Acad. Sci., vol. 81, no. 10, pp. 3088-3092, May 1984.
[28] N. El-Leithy, R. W. Newcomb, and M. Zaghloul, "A basic MOS neural-type junction: A prospective on neural-type microsystems," in Proc. 1987 IEEE ICNN, San Diego, CA, June 1987, pp. III-469-477.
[29] M. Habib, H. Akal, and R. Newcomb, "Logic gate formed neuron-type processing elements," in Proc. 1988 Int. Symp. Circuits Syst., Helsinki, Finland, June 1988, pp. 491-494.
[30] J. Sklansky and G. N. Wassel, Pattern Classifiers and Trainable Machines. New York: Springer-Verlag, 1981.
[31] R. Winter and B. Widrow, "MADALINE RULE II: A training rule for neural networks," in Proc. IEEE/INNS Joint Conf. Neural Networks (San Diego, CA), July 1988, pp. 1.401-1.408.
[32] B. Widrow and R. G. Winter, "Neural nets for adaptive filtering and adaptive pattern recognition," IEEE Computer Mag., pp. 25-39, Mar. 1988.
[33] T. Schwartz, "Introduction to neural networks," IEEE Video Conf. Neural Networks, Sept. 1989.
[34] S. Y. Kung and J. N. Hwang, "A unified modeling of connectionist neural networks," J. Parallel Distributed Comput., vol. 6, pp. 358-387, 1989.
[35] J. I. Minnix, E. S. McVey, and R. M. Iñigo, "Modified neocognitron with position normalizing preprocessor for translation invariant shape recognition," in Int. Joint Conf. Neural Nets, vol. 1, June 1990, pp. 395-399.
[36] C. Sung and D. Wilson, "Percognitron: Neocognitron coupled with perceptron," in Int. Joint Conf. Neural Nets, vol. 3, June 1990, pp. 753-758.
[37] G. Fahner, "A higher order unit that performs arbitrary Boolean functions," in Int. Joint Conf. Neural Nets, vol. 3, June 1990, pp. 193-197.
[38] D. H. Hubel and T. N. Wiesel, "Receptive fields, binocular interaction and functional architecture in the cat's visual cortex," J. Physiology, vol. 160, pp. 106-154, Jan. 1962.
[39] D. H. Hubel and T. N. Wiesel, "Receptive fields and functional architecture in two nonstriate visual areas (18 and 19) of the cat," J. Neurophysiology, vol. 28, pp. 229-289, 1965.
[40] D. Kirk and D. Voorhies, "The rendering architecture of the DN10000VS," Computer Graphics (Proc. Siggraph), vol. 24, no. 4, Aug. 1990, pp. 299-307.
[41] Y. S. Abu-Mostafa, "Connectivity versus entropy," in Neural Information Processing Systems, D. Z. Anderson, Ed. New York: American Institute of Physics, 1988, pp. 1-8.

Index Terms:
character recognition; pattern recognition; neural nets; feature recognition neural network; Boolean net algorithm; noise; deformation; translation; scaling; Neocognitron; interconnection complexity; information storage/retrieval
Citation:
B. Hussain, M.R. Kabuka, "A Novel Feature Recognition Neural Network and its Application to Character Recognition," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 16, no. 1, pp. 98-106, Jan. 1994, doi:10.1109/34.273711