Issue No. 04 - April (2009 vol. 21)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TKDE.2008.178
Chih-Ming Hsu , National Taiwan University, Taipei
Ming-Syan Chen , National Taiwan University, Taipei
Effective distance functions in high dimensional data space are very important in solutions for many data mining problems. Recent research has shown that if the Pearson variation of the distance distribution converges to zero with increasing dimensionality, the distance function will become unstable (or meaningless) in high dimensional space, even with the commonly used $L_p$ metric in the Euclidean space. This result has spawned many studies along the same lines. However, the necessary condition for the instability of a distance function, which is required for function design, remains unknown. In this paper, we shall prove that several important conditions are in fact equivalent to instability. Based on these theoretical results, we employ some effective and valid indices for testing the stability of a distance function. In addition, this theoretical analysis suggests that the unstable phenomenon is rooted in the variation of the distance distribution. To demonstrate the theoretical results, we design a meaningful distance function, called the Shrinkage-Divergence Proximity (SDP), based on a given distance function. It is shown empirically that the SDP significantly outperforms other measures in terms of stability in high dimensional data space, and is thus more suitable for distance-based clustering applications.
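The concentration effect underlying the abstract's stability criterion can be illustrated numerically: as dimensionality grows, the Pearson variation (coefficient of variation, i.e., standard deviation divided by mean) of pairwise $L_2$ distances among uniformly random points shrinks toward zero. The sketch below is illustrative only and is not the paper's SDP construction; the function name `pearson_variation` and the uniform-data setup are assumptions for demonstration.

```python
import numpy as np

def pearson_variation(X):
    """Coefficient of variation (std / mean) of all pairwise
    Euclidean distances among the rows of X."""
    n = len(X)
    # Full pairwise distance matrix, then keep the upper triangle
    # (each unordered pair counted once, diagonal excluded).
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    pairs = dist[np.triu_indices(n, k=1)]
    return pairs.std() / pairs.mean()

rng = np.random.default_rng(0)
# Same number of points, low vs. high dimensionality.
cv_low_dim = pearson_variation(rng.random((200, 2)))
cv_high_dim = pearson_variation(rng.random((200, 1000)))

# The relative spread of distances collapses as dimensionality grows,
# which is the instability phenomenon the paper analyzes.
print(cv_low_dim > cv_high_dim)  # → True
```

When the Pearson variation approaches zero, nearest and farthest neighbors become nearly indistinguishable, which is why distance-based methods degrade in high-dimensional space.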
Data mining, Feature extraction or construction, Clustering
C.-M. Hsu and M.-S. Chen, "On the Design and Applicability of Distance Functions in High-Dimensional Data Space," in IEEE Transactions on Knowledge & Data Engineering, vol. 21, no. 4, pp. 523-536, April 2009.