Title :
Transferred correlation learning: An incremental scheme for neural network ensembles
Author :
Jiang, Lei ; Zhang, Jian ; Allen, Gabrielle
Abstract :
Transfer learning is a learning paradigm in which, besides the training data for the target learning task, data related to that task (often drawn from a different distribution) are also employed to help train a better learner; outdated data, for example, can serve as such related data. In this paper, we propose a new transfer learning framework for training neural network (NN) ensembles. The framework has two key features: 1) it uses the well-known negative correlation learning to train an ensemble of diverse neural networks on the related data, fully exploiting the knowledge in those data; and 2) a penalized incremental learning scheme adapts the neural networks obtained from negative correlation learning to the training data of the target learning task. The adaptation is guided by reference neural networks that measure the relatedness between the training data and the related data. Experiments on benchmark data sets show that our framework achieves classification accuracy competitive with existing ensemble transfer learning methods such as TrAdaBoost and TrBagg. We discuss some characteristics of our framework observed in the experiments and the scenarios under which the framework may have superior performance.
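The abstract only names negative correlation learning (NCL) without spelling it out. As a rough illustration of the first stage of the framework, the sketch below trains a small ensemble with the standard NCL member gradient (f_i - y) - lambda * (f_i - f_bar), where f_bar is the ensemble mean, on a toy regression problem. The ensemble size, network architecture, and all hyperparameters here are illustrative assumptions, not the paper's actual setup, and the second-stage penalized incremental adaptation is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumption: the paper uses classification benchmarks)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

M = 5      # ensemble size (illustrative)
lam = 0.5  # NCL penalty strength lambda
lr = 0.05  # learning rate
H = 10     # hidden units per member network

# One-hidden-layer tanh networks, randomly initialised
W1 = [rng.normal(0, 0.5, (1, H)) for _ in range(M)]
b1 = [np.zeros(H) for _ in range(M)]
W2 = [rng.normal(0, 0.1, (H, 1)) for _ in range(M)]
b2 = [np.zeros(1) for _ in range(M)]

def forward(i, X):
    """Hidden activations and scalar output of ensemble member i."""
    h = np.tanh(X @ W1[i] + b1[i])
    return h, (h @ W2[i] + b2[i])[:, 0]

def ensemble_mse():
    preds = np.mean([forward(i, X)[1] for i in range(M)], axis=0)
    return np.mean((preds - y) ** 2)

mse_before = ensemble_mse()

for epoch in range(500):
    hs, fs = zip(*(forward(i, X) for i in range(M)))
    fbar = np.mean(fs, axis=0)  # combined ensemble output
    for i in range(M):
        # NCL gradient w.r.t. member output:
        #   d e_i / d f_i = (f_i - y) - lambda * (f_i - fbar)
        # The negative term pushes members away from the ensemble
        # mean, encouraging diversity.
        g = (fs[i] - y) - lam * (fs[i] - fbar)
        h = hs[i]
        # Backpropagate through the small network
        gW2 = h.T @ g[:, None] / len(X)
        gb2 = g.mean()
        gh = (g[:, None] @ W2[i].T) * (1 - h ** 2)
        gW1 = X.T @ gh / len(X)
        gb1 = gh.mean(axis=0)
        W2[i] -= lr * gW2
        b2[i] -= lr * gb2
        W1[i] -= lr * gW1
        b1[i] -= lr * gb1

mse_after = ensemble_mse()
```

In the full framework these NCL-trained members would then be adapted to the target-task data under the penalized incremental scheme described in the abstract.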
Keywords :
data mining; learning (artificial intelligence); neural nets; pattern classification; TrAdaBoost; TrBagg; incremental learning scheme; knowledge discovery; training neural network ensembles; transferred correlation learning; negative correlation learning; Accuracy; Artificial neural networks; Correlation; Degradation; Machine learning; Training; Training data
Conference_Titel :
The 2010 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Barcelona
Print_ISBN :
978-1-4244-6916-1
DOI :
10.1109/IJCNN.2010.5596617