Issue No. 12 - December (2010 vol. 22)
ISSN: 1041-4347
pp: 1738-1751
Huanhuan Chen , University of Birmingham, Birmingham
Xin Yao , University of Birmingham, Birmingham
Negative Correlation Learning (NCL) is a neural network ensemble learning algorithm that introduces a correlation penalty term into the cost function of each individual network, so that each network minimizes its mean-square error (MSE) together with its correlation with the rest of the ensemble. This paper describes NCL in detail and observes that NCL corresponds to training the entire ensemble as a single learning machine that minimizes the MSE without any regularization. This insight explains why NCL is prone to overfitting the noise in the training set. The paper analyzes this problem and proposes the multiobjective regularized negative correlation learning (MRNCL) algorithm, which incorporates an additional regularization term for the ensemble and uses an evolutionary multiobjective algorithm to design ensembles. In MRNCL, we define the crossover and mutation operators and adopt nondominated sorting with fitness sharing and rank-based fitness assignment. Experiments on synthetic data as well as real-world data sets demonstrate that MRNCL achieves better performance than NCL, especially when the noise level in the data set is nontrivial. In the experimental discussion, we give three reasons why our algorithm outperforms others.
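To make the cost function concrete, the following is a minimal NumPy sketch of the per-network NCL loss described above: each member's MSE plus a correlation penalty weighted by a strength parameter. The function name, variable names, and the default penalty weight are ours for illustration, not the paper's notation.

```python
import numpy as np

def ncl_losses(preds, y, lam=0.5):
    """Per-network NCL loss: MSE plus a correlation penalty.

    preds: (M, N) array of M ensemble members' predictions on N points.
    y:     (N,) array of targets.
    lam:   penalty strength; lam = 0 reduces to independent MSE training.

    Illustrative sketch only; the penalty for network i is
    p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar),
    where f_bar is the simple ensemble average.
    """
    f_bar = preds.mean(axis=0)                  # ensemble output, shape (N,)
    dev = preds - f_bar                         # per-network deviations, (M, N)
    penalty = dev * (dev.sum(axis=0) - dev)     # sum over j != i
    mse = (preds - y) ** 2
    return (mse + lam * penalty).mean(axis=1)   # one scalar loss per network
```

Because the deviations from the ensemble mean sum to zero across members, the penalty simplifies to -(f_i - f_bar)^2, which rewards members for disagreeing with the ensemble average; this is the mechanism the paper shows can amplify overfitting to noise when no regularization term is present.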
Multiobjective algorithm, multiobjective learning, neural network ensembles, neural networks, negative correlation learning, regularization.

H. Chen and X. Yao, "Multiobjective Neural Network Ensembles Based on Regularized Negative Correlation Learning," in IEEE Transactions on Knowledge & Data Engineering, vol. 22, no. 12, pp. 1738-1751, 2010.