Title :
Efficient representation ranking for transfer learning
Author :
Son N. Tran; Artur d'Avila Garcez
Author_Institution :
Department of Computer Science, City University London, Northampton Square, UK, EC1V 0HB
Date :
7/1/2015
Abstract :
Representation learning has emerged recently as a useful tool for extracting features from data. In a range of applications, features learned from data have been shown to be superior to their hand-crafted counterparts. Many deep learning approaches take advantage of such feature extraction. However, further research is needed on how such features can be evaluated for re-use in related applications, with the aim of improving performance on those applications. In this paper, we present a new method for ranking the representations learned by a Restricted Boltzmann Machine, which is used regularly as a feature learner by deep networks. We show that high-ranking features, according to our method, capture more information than low-ranking ones. We then apply representation ranking to prune the network, and propose a new transfer learning algorithm, which uses the features extracted from a trained network to improve learning performance in another network trained on an analogous domain. We show that by transferring a small number of the highest-scored representations from the source domain, our method encourages the learning of new knowledge in the target domain while preserving most of the information of the source domain during the transfer. This transfer learning is similar to self-taught learning in that it does not use the source domain data during the transfer process.
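Illustration :
The abstract does not give the paper's ranking criterion, so the following is a minimal sketch of the general idea only: score the hidden units of a source-domain RBM, keep the top-ranked ones, and use their weight vectors to initialise part of a target-domain RBM. The scoring function used here (L2 norm of each hidden unit's weight vector) and all sizes are placeholder assumptions, not the authors' method.

import numpy as np

rng = np.random.default_rng(0)

n_visible, n_hidden_src, n_hidden_tgt = 784, 500, 500
n_transfer = 50  # number of highest-scored representations to transfer

# Stand-in for weights of an RBM already trained on the source domain.
W_src = rng.normal(scale=0.01, size=(n_visible, n_hidden_src))

# Rank hidden units; the norm of each weight column is a hypothetical proxy
# for how much information a representation captures.
scores = np.linalg.norm(W_src, axis=0)
top_units = np.argsort(scores)[::-1][:n_transfer]

# Initialise the target RBM: transferred columns are copied from the source;
# the remaining columns start from small random values and are learned anew
# on the target domain (no source data is needed during the transfer).
W_tgt = rng.normal(scale=0.01, size=(n_visible, n_hidden_tgt))
W_tgt[:, :n_transfer] = W_src[:, top_units]

print("transferred units:", top_units[:10], "...")

The same selection step can also be read as pruning: discarding the low-scored columns of W_src leaves a smaller network built from the highest-ranked representations.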
Keywords :
"Feature extraction","Detectors","Optimization"
Conference_Title :
2015 International Joint Conference on Neural Networks (IJCNN)
Electronic_ISSN :
2161-4407
DOI :
10.1109/IJCNN.2015.7280454