Learning with Box Kernels
IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 35, no. 11, Nov. 2013
S. Melacci , Dept. of Inf. Eng. & Math. Sci., Univ. of Siena, Siena, Italy
M. Gori , Dept. of Inf. Eng. & Math. Sci., Univ. of Siena, Siena, Italy
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2013.73
Supervised examples and prior knowledge on regions of the input space have been profitably integrated in kernel machines to improve the performance of classifiers in different real-world contexts. The proposed solutions, which rely on the unified supervision of points and sets, have been mostly based on specific optimization schemes in which, as usual, the kernel function operates on points only. In this paper, arguments from variational calculus are used to support the choice of a special class of kernels, referred to as box kernels, which emerges directly from the choice of the kernel function associated with a regularization operator. It is proven that there is no need to search for kernels to incorporate the structure deriving from the supervision of regions of the input space, because the optimal kernel arises as a consequence of the chosen regularization operator. Although most of the given results hold for sets, we focus our attention on boxes, whose labeling is associated with their propositional description. Based on different assumptions, some representer theorems are given that dictate the structure of the solution in terms of a box kernel expansion. Successful results are reported on problems of medical diagnosis, image categorization, and text categorization.
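The paper derives box kernels from a regularization operator via Green's functions; as a much simpler illustration of the underlying idea (a kernel that compares a point with a labeled region rather than with another point), the sketch below averages a Gaussian base kernel over an axis-aligned box. This is an assumption-laden toy construction, not the paper's derivation: the function name `gauss_box_kernel` and the bandwidth parameter `sigma` are hypothetical. For a product (RBF) kernel, the average over a box factorizes per dimension into closed-form `erf` terms.

```python
import math

def gauss_box_kernel(x, lo, hi, sigma=1.0):
    """Illustrative point-to-box kernel (NOT the paper's exact box kernel):
    the average of the Gaussian kernel exp(-||x - y||^2 / (2*sigma^2))
    over y drawn uniformly from the axis-aligned box [lo, hi].

    Because the Gaussian kernel is a product over dimensions, the average
    factorizes, and each 1-D factor has the closed form
        (1 / (b - a)) * sigma * sqrt(pi/2) *
        (erf((b - x_d) / (sigma*sqrt(2))) - erf((a - x_d) / (sigma*sqrt(2)))).
    """
    val = 1.0
    for xd, a, b in zip(x, lo, hi):
        s = sigma * math.sqrt(2.0)
        # Closed-form 1-D integral of the Gaussian kernel over [a, b].
        integral = sigma * math.sqrt(math.pi / 2.0) * (
            math.erf((b - xd) / s) - math.erf((a - xd) / s))
        val *= integral / (b - a)
    return val
```

In a learning machine of the kind the abstract describes, such point-to-box evaluations would let labeled boxes (e.g., regions defined by propositional rules) enter the kernel expansion alongside ordinary supervised points.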
Keywords: kernel machines, box kernels, Green's function methods, support vector machines, regularization operators, optimization, probability distributions, propositional rules
S. Melacci and M. Gori, "Learning with Box Kernels," IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 35, no. 11, pp. 2680-2692, Nov. 2013, doi:10.1109/TPAMI.2013.73.