Title :
Likelihood-based regularization and differential log-likelihood in kernel-based topographic map formation
Author_Institution :
Laboratorium voor Neuro- en Psychofysiologie, KU Leuven
Date :
Sept. 29 - Oct. 1, 2004
Abstract :
Two new principles for kernel-based density estimation and kernel-based topographic map formation are introduced: likelihood-based regularization and differential log-likelihood. The former ensures that every kernel has an equal probability of generating data points; the latter is an unbiased metric for judging the quality of the density estimate. We apply these principles to kernel-based topographic map formation based on log-likelihood maximization, restricting ourselves to Gaussian kernels and homogeneous, homoscedastic mixtures. We show that the negative log-likelihood equals the quantization error of the map up to a scale factor.
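The abstract's central relation, that the negative log-likelihood of a homogeneous, homoscedastic Gaussian kernel mixture equals the map's quantization error up to a scale factor, can be checked numerically in a small sketch. All names, sizes, and the winner-take-all (nearest-kernel) approximation below are our own illustrative assumptions, not the paper's derivation:

```python
import numpy as np

rng = np.random.default_rng(0)
d, K, N = 2, 5, 200          # dimension, number of kernels, number of points
sigma = 0.3                  # shared kernel width (homoscedastic)
W = rng.normal(size=(K, d))  # kernel centres (map unit weights, hypothetical)
X = rng.normal(size=(N, d))  # data points

# Squared distance from every point to every kernel centre, shape (N, K)
d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)

# Exact negative log-likelihood of the equal-weight (homogeneous) mixture
# p(x) = (1/K) sum_k N(x; w_k, sigma^2 I), computed via a log-sum-exp.
a = -d2 / (2 * sigma**2)
m = a.max(axis=1, keepdims=True)
log_norm = np.log(K) + 0.5 * d * np.log(2 * np.pi * sigma**2)
log_p = m[:, 0] + np.log(np.exp(a - m).sum(axis=1)) - log_norm
nll_exact = -log_p.mean()

# Winner-take-all approximation: keep only the nearest kernel per point, so
# -log p(x) ~ min_k ||x - w_k||^2 / (2 sigma^2) + const, i.e. the mean
# quantization error scaled by 1/(2 sigma^2) plus a constant.
quant_err = d2.min(axis=1).mean()
nll_wta = quant_err / (2 * sigma**2) + log_norm

# The exact value is bracketed: nll_wta - log(K) <= nll_exact <= nll_wta.
print(nll_exact, nll_wta, quant_err)
```

The bracketing follows because the equal-weight mixture density lies between its largest component divided by K and its largest component itself; as sigma shrinks, the two quantities coincide and the negative log-likelihood is exactly the scaled quantization error plus a constant.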
Keywords :
Gaussian processes; entropy; learning (artificial intelligence); maximum likelihood estimation; Gaussian kernels; density estimate; differential log-likelihood; homoscedastic mixings; kernel-based density estimation; kernel-based topographic map formation; likelihood-based regularization; log-likelihood maximization; negative log-likelihood; quantization error; scale factor; unbiased metric; Birth disorders; Clustering algorithms; Entropy; Kernel; Laboratories; Lattices; Machine learning; Machine learning algorithms; Psychology; Quantization;
Conference_Titel :
Machine Learning for Signal Processing, 2004. Proceedings of the 2004 14th IEEE Signal Processing Society Workshop
Conference_Location :
São Luís, Brazil
Print_ISBN :
0-7803-8608-4
DOI :
10.1109/MLSP.2004.1422954