Issue No. 02 - February 1980 (vol. 2)
pp. 101-111
Jack Sklansky, SENIOR MEMBER, IEEE, Departments of Electrical Engineering, Information and Computer Science, and Radiological Sciences, University of California, Irvine, CA 92717.
Leo Michelotti, Microelectronic Systems Division, Hughes Aircraft Company, Irvine, CA 92715.
ABSTRACT
We describe a versatile technique for designing computer algorithms for separating multidimensional data (feature vectors) into two classes. We refer to these algorithms as classifiers. Our classifiers achieve nearly Bayes-minimum error rates while requiring relatively small amounts of memory. Our design procedure finds a set of close-opposed pairs of clusters of data. From these pairs the procedure generates a piecewise-linear approximation of the Bayes-optimum decision surface. A window training procedure on each linear segment of the approximation provides great flexibility of design over a wide range of class densities. The data consumed in the training of each segment are restricted to just those data lying near that segment, which makes possible the construction of efficient data bases for the training process. Interactive simplification of the classifier is facilitated by an adjacency matrix and an incidence matrix. The adjacency matrix describes the interrelationships of the linear segments {L_i}. The incidence matrix describes the interrelationships among the polyhedrons formed by the hyperplanes containing {L_i}. We exploit switching theory to minimize the decision logic.
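To make the decision structure concrete, the following is a minimal sketch of how such a piecewise-linear classifier can be evaluated once its hyperplanes have been trained (e.g., one segment per close-opposed pair of clusters): a point is mapped to the polyhedral cell it occupies via its sign pattern against the hyperplanes, and a decision-logic table assigns that cell to a class. The function names and the toy parameters below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sign_vector(x, hyperplanes):
    """Return the tuple of signs of x against each hyperplane (w, b).

    Each entry is 1 if w.x + b >= 0, else 0; the tuple identifies the
    polyhedral cell of the hyperplane arrangement that x falls into.
    """
    return tuple(int(np.dot(w, x) + b >= 0) for w, b in hyperplanes)

def classify(x, hyperplanes, decision_logic, default=0):
    """Map x to a class label via its cell's entry in the decision logic.

    `decision_logic` plays the role of the (minimized) switching function
    that assigns each polyhedron to class 0 or 1.
    """
    return decision_logic.get(sign_vector(x, hyperplanes), default)

if __name__ == "__main__":
    # Two hyperplanes in the plane, x0 = 0 and x1 = 0 (toy example only).
    hyperplanes = [(np.array([1.0, 0.0]), 0.0),
                   (np.array([0.0, 1.0]), 0.0)]
    # Assign the first and third quadrants to class 1, the rest to class 0.
    decision_logic = {(1, 1): 1, (0, 0): 1, (1, 0): 0, (0, 1): 0}
    print(classify(np.array([0.5, 2.0]), hyperplanes, decision_logic))   # -> 1
    print(classify(np.array([-0.5, 2.0]), hyperplanes, decision_logic))  # -> 0
```

In the paper's scheme the training of each segment uses only the data near that segment, and the decision-logic table would be simplified with switching-theory minimization; neither step is shown in this sketch.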
CITATION
Jack Sklansky, Leo Michelotti, "Locally Trained Piecewise Linear Classifiers", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 2, no. 2, pp. 101-111, February 1980, doi:10.1109/TPAMI.1980.4766988
