The Relationship of the Bayes Risk to Certain Separability Measures in Normal Classification
February 1980 (vol. 2 no. 2)
pp. 97-100
Marvin Yablon, MEMBER, IEEE, Department of Mathematics, John Jay College of Criminal Justice, The City University of New York, New York, NY 10019.
J. T. Chu, Division of Management, Polytechnic Institute of New York, Brooklyn, NY 11201.
For the problem of classifying an element (e.g., an unknown pattern) into one of two given categories, where the associated observables are distributed according to one of two known multivariate normal populations with a common covariance matrix, it is shown that the minimum Bayes risk is a strictly monotonic function of certain separability or statistical distance measures, regardless of the a priori probabilities and the assigned loss function. For the associated conditional expected losses, however, strict monotonicity holds if and only if a certain condition dependent on these probabilities and the given loss function is satisfied. These results remain valid for classification problems in which the observable can be transformed to normality by a one-to-one differentiable mapping.
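As an illustrative sketch (not taken from the paper itself): in the special case of equal priors and 0-1 loss, the minimum Bayes risk for two equal-covariance normal populations reduces to the standard expression Φ(-Δ/2), where Δ is the Mahalanobis distance between the class means and Φ is the standard normal CDF. The strict monotonicity the abstract describes is then directly visible, since Φ(-Δ/2) strictly decreases as Δ grows.

```python
from math import erf, sqrt

def std_normal_cdf(x):
    # Standard normal CDF via the error function (stdlib only).
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bayes_error(delta):
    """Minimum Bayes risk under equal priors and 0-1 loss for two
    equal-covariance normal populations, where delta is the
    Mahalanobis distance between the class means."""
    return std_normal_cdf(-delta / 2.0)

# Zero separation gives the chance-level risk of 1/2.
print(bayes_error(0.0))  # 0.5

# Strict monotonicity: larger separation implies strictly smaller risk.
risks = [bayes_error(d) for d in (0.5, 1.0, 2.0, 4.0)]
assert all(a > b for a, b in zip(risks, risks[1:]))
```

The unequal-prior, general-loss case treated in the paper shifts the decision threshold but, per the abstract, leaves the overall Bayes risk strictly monotone in Δ; only the conditional expected losses require the extra condition.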
Citation:
Marvin Yablon, J. T. Chu, "The Relationship of the Bayes Risk to Certain Separability Measures in Normal Classification," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 2, no. 2, pp. 97-100, Feb. 1980, doi:10.1109/TPAMI.1980.4766987