Issue No. 05 - May (2013 vol. 35)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/SNPD.2012.116
A. J. Ma , Dept. of Comput. Sci., Hong Kong Baptist Univ., Hong Kong, China
P. C. Yuen , Dept. of Comput. Sci., Hong Kong Baptist Univ., Hong Kong, China
Jian-Huang Lai , Sch. of Inf. Sci. & Technol., Sun Yat-Sen Univ., Guangzhou, China
This paper addresses the independence assumption issue in the fusion process. In the last decade, dependency modeling techniques were developed either under a specific distribution of classifiers or by estimating the joint distribution of the posteriors. This paper proposes a new framework that models the dependency between features without any assumption on the feature/classifier distribution and overcomes the difficulty of estimating the high-dimensional joint density. We prove that feature dependency can be modeled by a linear combination of the posterior probabilities under some mild assumptions. Based on this linear combination property, two methods, namely, Linear Classifier Dependency Modeling (LCDM) and Linear Feature Dependency Modeling (LFDM), are derived for dependency modeling at the classifier level and feature level, respectively. The optimal models for LCDM and LFDM are learned by maximizing the margin between the genuine and imposter posterior probabilities. Both synthetic data and real datasets are used in the experiments. Experimental results show that LCDM and LFDM with dependency modeling outperform existing classifier-level and feature-level combination methods under nonnormal distributions and on four real databases, respectively. Comparing classifier-level and feature-level fusion methods, LFDM gives the best performance.
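To illustrate the flavor of the approach, the following is a minimal sketch of linear combination of classifier posteriors with margin-based weight learning. All data and the projected-gradient learner here are hypothetical stand-ins: the paper derives LCDM/LFDM and solves the margin maximization as a linear program, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posteriors from 3 base classifiers (not the paper's data):
# genuine samples tend to score high, imposters low, with unequal
# reliability per classifier so uniform weighting is suboptimal.
genuine = np.clip(rng.normal([0.8, 0.7, 0.55], 0.1, size=(100, 3)), 0, 1)
imposter = np.clip(rng.normal([0.3, 0.4, 0.5], 0.1, size=(100, 3)), 0, 1)

def learn_weights(gen, imp, steps=200, lr=0.1):
    """Toy margin maximization: ascend the mean score gap between genuine
    and imposter fused posteriors, keeping the weights nonnegative and
    summing to 1. (An illustrative stand-in for the paper's linear
    program, not the actual LCDM/LFDM solver.)"""
    w = np.full(gen.shape[1], 1.0 / gen.shape[1])
    for _ in range(steps):
        grad = gen.mean(axis=0) - imp.mean(axis=0)  # d(mean margin)/dw
        w = np.clip(w + lr * grad, 0.0, None)       # project to nonnegative
        w /= w.sum()                                # renormalize to simplex
    return w

w = learn_weights(genuine, imposter)
# Fused score of a sample is the weighted linear combination of its
# per-classifier posteriors.
margin = (genuine @ w).mean() - (imposter @ w).mean()
```

The learned weights shift mass toward the more discriminative classifiers, widening the genuine/imposter score gap relative to a uniform average of the posteriors.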
Joints, Mathematical model, Computational modeling, Kernel, Vectors, Linear programming, Optimization, multiple feature fusion, linear dependency modeling, feature dependency, classifier level fusion, feature level fusion
A. J. Ma, P. C. Yuen and Jian-Huang Lai, "Linear Dependency Modeling for Classifier Fusion and Feature Combination," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 35, no. 5, pp. 1135-1148, May 2013.