IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 33, no. 10, October 2011
Peter Wittek , National University of Singapore, Singapore
Chew Lim Tan , National University of Singapore, Singapore
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2011.28
Wavelet kernels have been introduced for both support vector regression and classification. Most of these wavelet kernels do not use the inner product of the embedding space, but use wavelets in a similar fashion to radial basis function kernels. Wavelet analysis is typically carried out on data with a temporal or spatial relation between consecutive data points. We argue that it is possible to order the features of a general data set so that consecutive features are statistically related to each other, thus enabling us to interpret the vector representation of an object as a series of equally or randomly spaced observations of a hypothetical continuous signal. By approximating the signal with compactly supported basis functions and employing the inner product of the embedding L_2 space, we gain a new family of wavelet kernels. Empirical results show a clear advantage in favor of these kernels.
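The abstract's central idea — treating a feature vector as samples of a hypothetical continuous signal, expanding it in compactly supported basis functions, and taking the inner product in L_2 as the kernel — can be illustrated with a small sketch. This is not the paper's exact construction: as an assumption for illustration, it uses linear B-spline (hat) functions on a unit grid as the compactly supported basis, so the L_2 inner product reduces to a quadratic form x^T G y with a tridiagonal Gram matrix G.

```python
import numpy as np
from sklearn.svm import SVC


def hat_gram(d):
    # Gram matrix of linear B-spline (hat) basis functions on a unit grid:
    # <phi_i, phi_i> = 2/3, <phi_i, phi_{i+1}> = 1/6, zero otherwise.
    G = np.zeros((d, d))
    np.fill_diagonal(G, 2.0 / 3.0)
    idx = np.arange(d - 1)
    G[idx, idx + 1] = 1.0 / 6.0
    G[idx + 1, idx] = 1.0 / 6.0
    return G


def l2_basis_kernel(X, Y):
    # Interpret each row as coefficients of a signal expanded in the hat
    # basis; the kernel is the L2 inner product of the two signals,
    # K(x, y) = x^T G y. G is positive definite, so this is a valid
    # (Mercer) kernel.
    G = hat_gram(X.shape[1])
    return X @ G @ Y.T


# Hypothetical usage: random data standing in for features that have
# already been ordered so that neighbouring columns are correlated.
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 8))
y = (X[:, :4].sum(axis=1) > 0).astype(int)
clf = SVC(kernel=l2_basis_kernel).fit(X, y)
```

Because the hat functions overlap only with their immediate neighbours, the kernel couples each feature with the adjacent ones — which is exactly why the feature ordering step matters: the approximation only helps if consecutive features are statistically related.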
Wavelet kernels, feature engineering, feature correlation, semantic kernels.
Peter Wittek, Chew Lim Tan, "Compactly Supported Basis Functions as Support Vector Kernels for Classification", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 33, no. 10, pp. 2039-2050, October 2011, doi:10.1109/TPAMI.2011.28