Issue No. 01 - Jan. (2014 vol. 36)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2013.93
Anne Hendrikse , Signals & Syst. Group, Univ. of Twente, Overijssel, Netherlands
Raymond Veldhuis , Signals & Syst. Group, Univ. of Twente, Overijssel, Netherlands
Luuk Spreeuwers , Signals & Syst. Group, Univ. of Twente, Overijssel, Netherlands
Increasing the dimensionality of data sets often leads to estimation problems, collectively denoted as the curse of dimensionality. One such problem in second-order statistics (SOS) estimation on high-dimensional data is that the resulting covariance matrices are not full rank, so their inversion, needed for example in verification systems based on the likelihood ratio, is an ill-posed problem, known as the singularity problem. A classical solution is to project the data onto a lower dimensional subspace using principal component analysis (PCA), under the assumption that any further estimation on the dimension-reduced data is free from the effects of the high dimensionality. Using theory on SOS estimation in high-dimensional spaces, we show that the PCA solution is far from optimal in verification systems when the high dimensionality is the sole source of error. Already at moderate dimensionality it is outperformed by solutions based on Euclidean distances, and it breaks down completely when the dimensionality becomes very high. We propose a new method, the fixed-point eigenwise correction, which does not have these disadvantages and performs close to optimal.
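A minimal NumPy sketch (illustrative only, not the authors' implementation) of the singularity problem and the classical PCA workaround described in the abstract: with fewer samples than dimensions the sample covariance is rank-deficient and cannot be inverted, but after projecting onto a few principal components the reduced covariance is invertible. The dimensions and the retained subspace size are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fewer samples than dimensions: the sample covariance is rank-deficient.
d, n = 50, 20                       # dimensionality, number of training samples
X = rng.standard_normal((n, d))

cov = np.cov(X, rowvar=False)       # d x d sample covariance
rank = np.linalg.matrix_rank(cov)   # at most n - 1 < d, so cov is singular

# Classical fix: project onto the leading principal components,
# then estimate (and invert) the covariance in the reduced space.
Xc = X - X.mean(axis=0)             # center the data
_, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 10                              # retained subspace dimension (illustrative)
Z = Xc @ Vt[:k].T                   # PCA-projected data, n x k

cov_z = np.cov(Z, rowvar=False)     # k x k, full rank
prec_z = np.linalg.inv(cov_z)       # invertible, usable in a likelihood-ratio score
```

The paper's point is that this projection does not actually remove the estimation errors caused by high dimensionality; the proposed fixed-point eigenwise correction addresses them directly.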
Eigenvalues and eigenfunctions, Covariance matrices, Principal component analysis, Estimation, Training, Euclidean distance, Training data
A. Hendrikse, R. Veldhuis and L. Spreeuwers, "Likelihood-Ratio-Based Verification in High-Dimensional Spaces," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 36, no. 1, pp. 127-139, 2014.