Title :
Regularization in Matrix Relevance Learning
Author :
Schneider, Petra ; Bunte, Kerstin ; Stiekema, Han ; Hammer, Barbara ; Villmann, Thomas ; Biehl, Michael
Author_Institution :
Johann Bernoulli Inst. for Math. & Comput. Sci., Univ. of Groningen, Groningen, Netherlands
Date :
5/1/2010 12:00:00 AM
Abstract :
In this paper, we present a regularization technique to extend recently proposed matrix learning schemes in learning vector quantization (LVQ). These learning algorithms extend the concept of adaptive distance measures in LVQ to the use of relevance matrices. In general, metric learning can display a tendency towards oversimplification in the course of training: an overly pronounced elimination of dimensions in feature space can degrade performance and lead to instabilities during training. We focus on matrix learning in generalized LVQ (GLVQ). Extending the cost function by an appropriate regularization term prevents this unfavorable behavior and can help to improve the generalization ability. The approach is first tested and illustrated on artificial model data. We then apply the scheme to benchmark classification data sets from the UCI Machine Learning Repository. We also demonstrate the usefulness of regularization for rank-limited relevance matrices, i.e., matrix learning with an implicit, low-dimensional representation of the data.
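The core idea summarized in the abstract can be sketched numerically. In matrix relevance LVQ, the adaptive distance is d(x, w) = (x − w)ᵀ Λ (x − w) with Λ = ΩᵀΩ, and a penalty proportional to −ln det(ΩΩᵀ) discourages Ω from collapsing onto few dimensions. The following is a minimal illustrative sketch, not the authors' implementation; the function names, the matrix shapes, and the regularization weight `mu` are assumptions for illustration only.

```python
import numpy as np

def adaptive_distance(x, w, omega):
    """Squared adaptive distance (x-w)^T Lambda (x-w) with Lambda = Omega^T Omega."""
    diff = omega @ (x - w)
    return float(diff @ diff)

def regularization_penalty(omega, mu):
    """Penalty -mu * ln det(Omega Omega^T); it grows as Omega loses rank."""
    sign, logdet = np.linalg.slogdet(omega @ omega.T)
    return -mu * logdet

def penalty_gradient(omega, mu):
    """Analytic gradient of the penalty w.r.t. Omega: -2 mu (Omega Omega^T)^{-1} Omega."""
    return -2.0 * mu * np.linalg.inv(omega @ omega.T) @ omega

# Toy data: a rank-limited 2x4 transform Omega, as in the low-dimensional case.
rng = np.random.default_rng(0)
omega = rng.normal(size=(2, 4))
x = rng.normal(size=4)
w = rng.normal(size=4)
mu = 0.1

d = adaptive_distance(x, w, omega)
p = regularization_penalty(omega, mu)
g = penalty_gradient(omega, mu)

# Finite-difference check that the analytic gradient matches the penalty.
eps = 1e-6
num = np.zeros_like(omega)
for i in range(omega.shape[0]):
    for j in range(omega.shape[1]):
        op = omega.copy(); op[i, j] += eps
        om = omega.copy(); om[i, j] -= eps
        num[i, j] = (regularization_penalty(op, mu)
                     - regularization_penalty(om, mu)) / (2 * eps)
```

In a training loop, this gradient would simply be added to the GLVQ cost-function gradient for Ω, so that steps shrinking det(ΩΩᵀ) toward zero incur a growing penalty.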
Keywords :
data analysis; learning (artificial intelligence); matrix algebra; vector quantisation; adaptive distance measures; classification data sets benchmarking; machine learning; matrix relevance learning; metric learning; cost function; learning vector quantization (LVQ); metric adaptation; regularization; Algorithms; Artificial Intelligence; Feedback; Humans; Learning; Neural Networks (Computer)
Journal_Title :
IEEE Transactions on Neural Networks
DOI :
10.1109/TNN.2010.2042729