Issue No. 06 - June (1994 vol. 16)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/34.295911
<p>This correspondence presents a method for evaluating artificial neural network (ANN) classifiers. To assess the performance of a network over its entire input range, a probabilistic input model is defined, and the expected error of the output over this range is taken as a measure of generalization ability. Two elements are essential to the proposed evaluation technique: estimation of the input probability density and numerical integration. A nonparametric method based on the M nearest neighbors is used to estimate the distribution locally around each training pattern, and an orthogonalization procedure determines the covariance matrices of the local densities. The numerical integration is performed by a Monte Carlo method. The proposed technique is used to investigate the generalization ability of back-propagation (BP), radial basis function (RBF), and probabilistic neural network (PNN) classifiers on three test problems.</p>
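The evaluation pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the local density around each training pattern is modeled as a Gaussian whose covariance is estimated from its M nearest neighbors (the paper's orthogonalization procedure is replaced here by a simple regularization term), and the expected error is then estimated by Monte Carlo sampling from the resulting mixture. All function names and parameters below are illustrative assumptions.

```python
import numpy as np

def local_gaussian_mixture(X, M=5):
    """Estimate a local Gaussian around each training pattern from its
    M nearest neighbors (illustrative stand-in for the paper's
    orthogonalization-based covariance estimation)."""
    n, d = X.shape
    covs = []
    for i in range(n):
        dists = np.linalg.norm(X - X[i], axis=1)
        nbrs = X[np.argsort(dists)[1:M + 1]]        # M nearest neighbors, excluding the point itself
        diff = nbrs - X[i]
        cov = diff.T @ diff / M + 1e-6 * np.eye(d)  # regularize so the covariance is invertible
        covs.append(cov)
    return covs

def expected_error(classifier, X, y, covs, n_samples=2000, seed=None):
    """Monte Carlo estimate of the expected classification error under the
    mixture of local Gaussians; each sample inherits the label of the
    training pattern whose component generated it."""
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(X), size=n_samples)   # mixture components weighted uniformly
    errors = 0
    for i in idx:
        x = rng.multivariate_normal(X[i], covs[i])
        errors += classifier(x) != y[i]
    return errors / n_samples

# Demo on a toy two-class problem with a nearest-class-mean classifier.
rng = np.random.default_rng(0)
XA = rng.normal(0.0, 0.3, size=(20, 2))
XB = rng.normal(5.0, 0.3, size=(20, 2))
X = np.vstack([XA, XB])
y = np.array([0] * 20 + [1] * 20)
covs = local_gaussian_mixture(X, M=5)
means = np.array([X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)])
clf = lambda x: int(np.linalg.norm(x - means[1]) < np.linalg.norm(x - means[0]))
err = expected_error(clf, X, y, covs, n_samples=500, seed=0)
```

For well-separated classes such as this toy problem, the estimated expected error should be near zero; a poorly generalizing classifier would score higher even if it fits the training patterns exactly, which is the point of integrating over the input model rather than scoring only the training set.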
pattern recognition; neural nets; generalisation (artificial intelligence); probability; integration; Monte Carlo methods; generalization ability; neural network classifiers; expected error; input probability density; numerical integration; covariance matrices; Monte Carlo method; back propagation; radial basis function; probabilistic neural network
K. Chan, D. Hummels, M. Musavi and K. Kalantri, "On the Generalization Ability of Neural Network Classifiers," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 16, no. 6, pp. 659-663, 1994.