Issue No. 5 - May 1984 (vol. 6), pp. 601-616
Salvatore D. Morgera , Department of Electrical Engineering, Concordia University, Montreal, P.Q., Canada.
Lokesh Datta , Department of Electrical Engineering, Concordia University, Montreal, P.Q., Canada.
ABSTRACT
Several authors have studied the problem of dimensionality reduction, or feature selection, using statistical distance measures such as the Chernoff coefficient, Bhattacharyya distance, I-divergence, and J-divergence, generally because direct use of the probability-of-classification-error expression was felt to be either computationally or mathematically intractable. We show that for the difficult problem of testing one weakly stationary Gaussian stochastic process against another, when the mean vectors are similar and the covariance matrices (patterns) differ, the probability-of-error expression can be dealt with directly using a combination of classical methods and distribution function theory. The results offer a new and accurate finite-dimensionality information-theoretic strategy for feature selection and are shown, by example, to be superior to the well-known Kadota-Shepp approach, which employs distance measures and asymptotics in its formulation. The present Part I deals with the theory; Part II deals with the implementation of a computer-based real-time pattern classifier that takes into account a realistic quasi-stationarity of the patterns.
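For readers unfamiliar with the distance-measure approach that the abstract contrasts with the authors' direct error-probability method, the following is a minimal sketch of Bhattacharyya-distance feature selection for the equal-mean, differing-covariance Gaussian setting described above. The function names and the greedy forward-selection loop are illustrative assumptions, not the algorithm of the paper; when the mean vectors coincide, only the covariance term of the Bhattacharyya distance survives.

```python
# Illustrative sketch (not the authors' method): Bhattacharyya distance
# between two equal-mean Gaussian classes as a feature-selection criterion.
import numpy as np

def bhattacharyya_gaussian(sigma1: np.ndarray, sigma2: np.ndarray) -> float:
    """Bhattacharyya distance for equal-mean Gaussians.

    With identical means, D_B reduces to the covariance term:
    D_B = (1/2) * ln( det((S1+S2)/2) / sqrt(det(S1) * det(S2)) ).
    """
    sigma = 0.5 * (sigma1 + sigma2)
    # slogdet is used for numerical stability on larger covariance matrices.
    _, logdet = np.linalg.slogdet(sigma)
    _, logdet1 = np.linalg.slogdet(sigma1)
    _, logdet2 = np.linalg.slogdet(sigma2)
    return 0.5 * (logdet - 0.5 * (logdet1 + logdet2))

def select_features(sigma1: np.ndarray, sigma2: np.ndarray, k: int) -> list:
    """Greedy forward selection of k features maximizing the distance.

    Purely illustrative; the paper argues that such distance-measure
    surrogates can be inferior to working with the error probability itself.
    """
    chosen, remaining = [], list(range(sigma1.shape[0]))
    for _ in range(k):
        best = max(
            remaining,
            key=lambda j: bhattacharyya_gaussian(
                sigma1[np.ix_(chosen + [j], chosen + [j])],
                sigma2[np.ix_(chosen + [j], chosen + [j])],
            ),
        )
        chosen.append(best)
        remaining.remove(best)
    return chosen

if __name__ == "__main__":
    # Toy usage: two random positive-definite covariance "patterns".
    rng = np.random.default_rng(0)
    a = rng.standard_normal((6, 6)); s1 = a @ a.T + np.eye(6)
    b = rng.standard_normal((6, 6)); s2 = b @ b.T + np.eye(6)
    print(select_features(s1, s2, k=3))
```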
CITATION
Salvatore D. Morgera, Lokesh Datta, "Toward a Fundamental Theory of Optimal Feature Selection: Part I", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 6, no. 5, pp. 601-616, May 1984, doi:10.1109/TPAMI.1984.4767573