2017 IEEE International Conference on Multimedia and Expo (ICME) (2017)
Hong Kong, Hong Kong
July 10, 2017 to July 14, 2017
ISSN: 1945-788X
ISBN: 978-1-5090-6068-9
pp: 853-858
Ziwei Yu , School of Computer Science and Technology, Tianjin University, Tianjin 300350, China
Changqing Zhang , School of Computer Science and Technology, Tianjin University, Tianjin 300350, China
Qinghua Hu , School of Computer Science and Technology, Tianjin University, Tianjin 300350, China
Pengfei Zhu , School of Computer Science and Technology, Tianjin University, Tianjin 300350, China
ABSTRACT
In this paper, we focus on improving multi-label learning with ensemble learning. Compared to traditional single-algorithm methods, it is well recognized that ensemble methods can achieve much better performance than any single constituent model, especially when the component classifiers are conditionally independent. Existing multi-label ensemble algorithms mainly create diverse component learners through different mechanisms, mostly randomization strategies driven by heuristics. In contrast to most existing methods, we propose an ensemble method that learns the base classifiers while explicitly accounting for the independence among them, so that each learned multi-label classifier is guaranteed to be diverse and complementary. Furthermore, since these classifiers differ in quality, a weight vector is learned to balance them. Experiments on several benchmark datasets demonstrate that the proposed method outperforms state-of-the-art methods.
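To make the idea of an independence-regularized, weighted multi-label ensemble concrete, the following is a minimal sketch, not the paper's exact formulation: it trains a few base multi-label classifiers on bootstrap resamples, then learns simplex-constrained combination weights by minimizing a validation squared loss plus a penalty on the pairwise correlation of the classifiers' outputs (a stand-in for an independence term). The data generation, base learners, loss, penalty, and the variables K, lam, C, and w are all illustrative assumptions.

```python
# Illustrative sketch only: a weighted multi-label ensemble whose combination
# weights are learned with a penalty that discourages correlated (redundant)
# base classifiers. The squared-loss objective and correlation penalty are
# assumptions for illustration, not the paper's exact method.
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier

rng = np.random.default_rng(0)

# Synthetic multi-label data: X in R^{n x d}, Y in {0,1}^{n x q}.
X, Y = make_multilabel_classification(n_samples=600, n_features=30,
                                      n_classes=6, random_state=0)
X_tr, X_val, Y_tr, Y_val = train_test_split(X, Y, test_size=0.3, random_state=0)

# Train K base multi-label classifiers on bootstrap resamples
# (a simple way to obtain distinct component learners).
K = 5
preds = []  # each entry: (n_val, q) score matrix F_k
for k in range(K):
    idx = rng.integers(0, len(X_tr), size=len(X_tr))
    clf = OneVsRestClassifier(LogisticRegression(max_iter=500))
    clf.fit(X_tr[idx], Y_tr[idx])
    preds.append(clf.predict_proba(X_val))
F = np.stack(preds)                                 # (K, n_val, q)

# Pairwise similarity between classifiers' outputs; penalizing w^T C w pushes
# weight toward mutually less correlated (more independent) learners.
C = np.corrcoef(F.reshape(K, -1))

lam = 0.5  # trade-off between data fit and the independence-style penalty

def objective(w):
    combined = np.tensordot(w, F, axes=1)           # weighted score matrix
    fit = np.mean((combined - Y_val) ** 2)          # squared loss on validation
    return fit + lam * w @ C @ w

# Weights constrained to the probability simplex.
w0 = np.full(K, 1.0 / K)
res = minimize(objective, w0, method="SLSQP",
               bounds=[(0.0, 1.0)] * K,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
print("learned weights:", np.round(res.x, 3))
```

The simplex constraint keeps the combined scores on the same scale as the individual classifiers, and the penalty shifts weight away from near-duplicate learners, which is the intuition behind balancing classifiers of different quality described in the abstract.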
INDEX TERMS
Kernel, Convergence, Correlation, Training, Loss measurement, Benchmark testing, Web pages
CITATION

Z. Yu, C. Zhang, Q. Hu and P. Zhu, "Independence regularized multi-label ensemble," 2017 IEEE International Conference on Multimedia and Expo (ICME), Hong Kong, Hong Kong, 2017, pp. 853-858.
doi:10.1109/ICME.2017.8019483