Issue No. 04 - April (2000 vol. 22)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/34.845383
<p><b>Abstract</b>—This paper describes a character recognition methodology (henceforth referred to as <it>Hierarchical OCR</it>) that achieves high speed and accuracy by using a multiresolution and hierarchical feature space. Features at different resolutions, from coarse to fine-grained, are implemented by means of a recursive classification scheme. Typically, recognizers have to balance the use of features at many resolutions (which yields high accuracy) against the burden on computational resources in terms of storage space and processing time. In this paper, we present a method that adaptively determines the degree of resolution necessary to classify an input pattern, leading to optimal use of computational resources. The <it>Hierarchical OCR</it> dynamically adapts to factors such as the quality of the input pattern, its intrinsic similarities to and differences from patterns of the other classes it is being compared against, and the processing time available. Furthermore, finer resolution is accorded only to certain “zones” of the input pattern that are deemed important given the classes being discriminated. Experimental results support the methodology presented. When tested on standard NIST data sets, the <it>Hierarchical OCR</it> proves to be 300 times faster than a traditional K-nearest-neighbor classification method and 10 times faster than a neural network method; the comparison uses the same feature set for all methods. The <it>Hierarchical OCR</it> achieves a recognition rate of about 96 percent, on par with the two traditional methods.</p>
Pattern recognition, character/digit recognition, multiresolution, feature space, hierarchical classification, recursion.
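The coarse-to-fine idea described in the abstract can be illustrated with a minimal sketch: classify at a coarse resolution first, stop early if one class wins by a clear margin, and otherwise recurse at a finer resolution over only the surviving candidate classes. This is an illustrative assumption-laden toy (average-pool downsampling, nearest-prototype distance, a fixed confidence margin), not the authors' actual feature set or classifier.

```python
# Hedged sketch of coarse-to-fine hierarchical classification.
# All names, parameters, and the distance/pruning rules are illustrative
# assumptions, not the paper's implementation.

def downsample(pattern, factor):
    """Average-pool a square 2D pattern by `factor` to get a coarser feature map."""
    n = len(pattern)
    m = n // factor
    coarse = [[0.0] * m for _ in range(m)]
    for i in range(m):
        for j in range(m):
            s = sum(pattern[i * factor + di][j * factor + dj]
                    for di in range(factor) for dj in range(factor))
            coarse[i][j] = s / (factor * factor)
    return coarse

def distance(a, b):
    """Squared Euclidean distance between two equal-sized 2D feature maps."""
    return sum((x - y) ** 2 for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def hierarchical_classify(pattern, prototypes, factors=(4, 2, 1), margin=0.5):
    """Classify coarse-to-fine: return early when the best class beats the
    runner-up by `margin`; otherwise recurse at a finer resolution over the
    surviving candidate classes only."""
    candidates = list(prototypes)
    for factor in factors:
        coarse = downsample(pattern, factor)
        scored = sorted(
            (min(distance(coarse, downsample(p, factor)) for p in prototypes[c]), c)
            for c in candidates)
        if len(scored) == 1 or scored[1][0] - scored[0][0] > margin:
            return scored[0][1]          # confident: no finer features needed
        # Ambiguous: keep the closest half of the classes and refine.
        candidates = [c for d, c in scored[:max(2, len(scored) // 2)]]
    return scored[0][1]                  # finest resolution reached; pick best
```

On easy patterns the margin test fires at a coarse resolution, which is where the speedup in such schemes comes from; only ambiguous inputs pay for fine-grained features, and only against the few classes still in contention.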
Sargur N. Srihari, Venu Govindaraju, Jaehwa Park, "OCR in a Hierarchical Feature Space", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 22, no. 4, pp. 400-407, April 2000, doi:10.1109/34.845383