IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 22, no. 2, February 2000
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/34.825759
<p><b>Abstract</b>—With a focus on classification problems, this paper presents a new method for linearly combining multiple neural network classifiers based on statistical pattern recognition theory. In the proposed approach, several neural networks are first selected according to which network performs best for each class in terms of minimizing classification errors. They are then linearly combined to form an ideal classifier that exploits the strengths of the individual classifiers. The minimum classification error (MCE) criterion is utilized to estimate the optimal linear weights. Because the classification decision rule is incorporated into the cost function in this formulation, a combination of weights better suited to the classification objective can be obtained. Experimental results on artificial and real data sets show that the proposed method constructs a combined classifier that outperforms the best single classifier in terms of overall classification error on test data.</p>
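The core idea of the abstract, combining the class-posterior outputs of several pre-trained classifiers with a learned weight per classifier, can be sketched as follows. This is an illustrative example only, not the paper's exact MCE weight-estimation algorithm; the classifier outputs, weights, and function names here are hypothetical.

```python
import numpy as np

def combine(outputs, weights):
    """Linearly combine classifier outputs.

    outputs: list of (n_samples, n_classes) posterior arrays, one per classifier.
    weights: one scalar per classifier (e.g., as estimated by an MCE-style criterion).
    Returns the weighted sum of the posterior arrays.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                          # normalize weights to sum to 1
    stacked = np.stack(outputs)              # (n_classifiers, n_samples, n_classes)
    return np.tensordot(w, stacked, axes=1)  # weighted sum over classifiers

def classify(outputs, weights):
    """Decide each sample's class by the highest combined score."""
    return np.argmax(combine(outputs, weights), axis=1)

# Toy example: two 3-class classifiers evaluated on the same 2 samples.
p1 = np.array([[0.6, 0.3, 0.1], [0.2, 0.5, 0.3]])
p2 = np.array([[0.4, 0.5, 0.1], [0.1, 0.2, 0.7]])
labels = classify([p1, p2], weights=[0.5, 0.5])  # -> array([0, 2])
```

In the paper's formulation the weights are not fixed by hand as above but estimated by minimizing a cost that embeds the classification decision rule itself, which is what ties the combination directly to the classification objective.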
Pattern classification, ensemble learning, linear combination, minimum classification error discriminant, neural network.
Naonori Ueda, "Optimal Linear Combination of Neural Networks for Improving Classification Performance", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 22, no. 2, pp. 207-215, February 2000, doi:10.1109/34.825759