
Issue No.09 - Sept. (2013 vol.35)

pp: 2051-2063

Qi Mao , Sch. of Comput. Eng., Nanyang Technol. Univ., Singapore, Singapore

I. W-H Tsang , Sch. of Comput. Eng., Nanyang Technol. Univ., Singapore, Singapore

ABSTRACT

Feature selection with specific multivariate performance measures is the key to the success of many applications such as image retrieval and text classification. Existing feature selection methods are usually designed for classification error. In this paper, we propose a generalized sparse regularizer. Based on the proposed regularizer, we present a unified feature selection framework for general loss functions. In particular, we study the novel feature selection paradigm of optimizing multivariate performance measures. The resultant formulation is a challenging problem for high-dimensional data, so a two-layer cutting plane algorithm is proposed to solve it, and its convergence is presented. In addition, we adapt the proposed method to optimize multivariate measures for multiple-instance learning problems. Analyses comparing the proposed method with state-of-the-art feature selection methods show that it is superior. Extensive experiments on large-scale and high-dimensional real-world datasets show that the proposed method outperforms l_{1}-SVM and SVM-RFE when choosing a small subset of features, and achieves significantly improved performance over SVM^{perf} in terms of F_{1}-score.

INDEX TERMS
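As context for why the abstract singles out F_{1}-score: it is a multivariate performance measure, defined over the entire prediction vector rather than summed over individual examples, which is what makes it incompatible with standard per-instance loss minimization. A minimal sketch (the function name and the {+1, -1} label encoding are illustrative assumptions, not from the paper):

```python
def f1_multivariate(y_true, y_pred):
    """Compute F1 over whole label vectors with labels in {+1, -1}.

    F1 is multivariate: it depends jointly on all predictions and
    does not decompose into a sum of per-example losses.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == -1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == -1)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)
```

Flipping a single prediction changes both precision and recall for the whole vector, which is why structural formulations such as SVM^{perf} optimize over entire label vectors at once.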

Loss measurement, Vectors, Support vector machines, Kernel, Convergence, Error analysis, Optimization,structural SVMs, Feature selection, performance measure, multiple kernel learning, multi-instance learning

CITATION

Qi Mao, I. W-H Tsang, "A Feature Selection Method for Multivariate Performance Measures",

*IEEE Transactions on Pattern Analysis & Machine Intelligence*, vol. 35, no. 9, pp. 2051-2063, Sept. 2013, doi: 10.1109/TPAMI.2012.266