Title :
Latent learning - What your net also learned
Author :
Gutstein, Steven ; Fuentes, Olac ; Freudenthal, Eric
Author_Institution :
Dept. of Comput. Sci., Univ. of Texas at El Paso, El Paso, TX, USA
Date :
July 31 - August 5, 2011
Abstract :
A neural net can learn to discriminate among a set of classes without explicitly training to do so; it does not even need exposure to any instances of those classes. The learning occurs while the net is being trained to discriminate among a set of related classes. Psychologists refer to this form of transfer learning as 'latent learning', because the knowledge remains latent until it is specifically elicited. Evidence that latent learning has occurred lies in the net's consistent, unique responses to the unseen classes. Standard supervised learning can then improve the accuracy of those responses with exceedingly small sets of labeled images. In this paper, we use a convolutional neural net (CNN) to demonstrate not only a method of determining a net's latent responses, but also simple ways to optimize latent learning. Additionally, we exploit the fact that CNNs are deep nets to show how the latently learned accuracy of the CNN may be greatly improved by allowing only its output layer to train. We compare our results both to those obtained with standard backpropagation training of the CNN on small datasets without any transfer learning and to a related set of recently published results.
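Example :
The output-layer-only training step described in the abstract can be sketched roughly as follows. This is a minimal illustration in PyTorch, not the authors' implementation: the small CNN architecture, the pretrained-weights file, and the ten-image labeled set are all hypothetical stand-ins. The feature layers are frozen and only the output layer is trained on a tiny labeled sample of the previously unseen classes.

    # Minimal sketch (assumed details, not the authors' code): freeze the
    # feature layers of a CNN pretrained on related classes, then train
    # only its output layer on a tiny labeled set of the new classes.
    import torch
    import torch.nn as nn

    class SmallCNN(nn.Module):
        def __init__(self, n_classes):
            super().__init__()
            # Hypothetical feature stack for 28x28 single-channel images.
            self.features = nn.Sequential(
                nn.Conv2d(1, 8, 5), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(8, 16, 5), nn.ReLU(), nn.MaxPool2d(2),
                nn.Flatten(),
            )
            self.output = nn.Linear(16 * 4 * 4, n_classes)

        def forward(self, x):
            return self.output(self.features(x))

    net = SmallCNN(n_classes=5)
    # net.load_state_dict(torch.load("pretrained_on_related_classes.pt"))  # hypothetical file

    # Freeze every parameter except those of the output layer.
    for p in net.features.parameters():
        p.requires_grad = False

    # The optimizer sees only the output layer's parameters.
    optimizer = torch.optim.SGD(net.output.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    # Exceedingly small labeled set of the previously unseen classes
    # (random stand-ins here).
    x_small = torch.randn(10, 1, 28, 28)
    y_small = torch.randint(0, 5, (10,))

    for epoch in range(20):
        optimizer.zero_grad()
        loss = loss_fn(net(x_small), y_small)
        loss.backward()
        optimizer.step()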
Keywords :
learning (artificial intelligence); neural nets; backpropagation training; convolutional neural net; latent learning; supervised learning; transfer learning; Accuracy; Backpropagation; Encoding; Handwriting recognition; Psychology; Semantics; Training;
Conference_Title :
The 2011 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
San Jose, CA, USA
Print_ISBN :
978-1-4244-9635-8
DOI :
10.1109/IJCNN.2011.6033376