From Sample Similarity to Ensemble Similarity: Probabilistic Distance Measures in Reproducing Kernel Hilbert Space
Issue No. 06, June 2006 (vol. 28)
Shaohua Kevin Zhou, IEEE
Rama Chellappa, IEEE
This paper addresses the problem of characterizing ensemble similarity from sample similarity in a principled manner. Using the reproducing kernel as a characterization of sample similarity, we suggest a probabilistic distance measure in the reproducing kernel Hilbert space (RKHS) as the ensemble similarity. Assuming normality in the RKHS, we derive analytic expressions for probabilistic distance measures that are commonly used in many applications, such as the Chernoff distance (with the Bhattacharyya distance as a special case), the Kullback-Leibler divergence, etc. Since the reproducing kernel implicitly embeds a nonlinear mapping, our approach presents a new way to study these distances, whose feasibility and efficiency are demonstrated using experiments with synthetic and real examples. Further, we extend the notion of ensemble similarity to a reproducing kernel for ensembles and study ensemble similarity for more general data representations.
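As background for the distances the abstract names, the sketch below computes the standard closed-form Kullback-Leibler divergence and Bhattacharyya distance between two multivariate Gaussians in the input space. This is not the paper's RKHS derivation; it only illustrates the textbook formulas under the same normality assumption, and the function names are illustrative.

```python
import numpy as np

def kl_divergence_gauss(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence D(N0 || N1) between two Gaussians."""
    d = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(cov1_inv @ cov0)      # trace term
                  + diff @ cov1_inv @ diff       # Mahalanobis term
                  - d                            # dimension offset
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def bhattacharyya_gauss(mu0, cov0, mu1, cov1):
    """Closed-form Bhattacharyya distance between two Gaussians."""
    cov = 0.5 * (cov0 + cov1)                    # average covariance
    diff = mu1 - mu0
    mean_term = 0.125 * diff @ np.linalg.inv(cov) @ diff
    cov_term = 0.5 * np.log(np.linalg.det(cov)
                            / np.sqrt(np.linalg.det(cov0) * np.linalg.det(cov1)))
    return mean_term + cov_term
```

For identical Gaussians both distances are zero; for unit-variance Gaussians whose means differ by one unit, the KL divergence is 0.5 and the Bhattacharyya distance is 0.125. The paper's contribution is to obtain analogous analytic expressions when the Gaussian model lives in the RKHS induced by the kernel.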
Ensemble similarity, kernel methods, Chernoff distance, Bhattacharyya distance, Kullback-Leibler (KL) divergence/relative entropy, Patrick-Fisher distance, Mahalanobis distance, reproducing kernel Hilbert space.
S. K. Zhou and R. Chellappa, "From Sample Similarity to Ensemble Similarity: Probabilistic Distance Measures in Reproducing Kernel Hilbert Space," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 28, no. 6, pp. 917-929, June 2006.