Title :
Tsallis kernels on measures
Author :
Martins, André F. T.; Aguiar, Pedro M. Q.; Figueiredo, Mário A. T.
Author_Institution :
Language Technol. Inst., Carnegie Mellon Univ., Pittsburgh, PA
Abstract :
Recent approaches to the classification of text, images, and other types of structured data have launched a quest for positive definite (p.d.) kernels on probability measures. In particular, kernels based on the Jensen-Shannon (JS) divergence and other information-theoretic quantities have been proposed. We introduce new JS-type divergences by extending the two building blocks of the JS divergence: convexity and Shannon's entropy. These divergences are then used to define new information-theoretic kernels on measures. In particular, we introduce a new concept of q-convexity, for which a Jensen q-inequality is proved. Based on this inequality, we introduce the Jensen-Tsallis q-difference, a nonextensive generalization of the Jensen-Shannon divergence. Furthermore, we provide denormalization formulae for entropies and divergences, which we use to define a family of nonextensive information-theoretic kernels on measures. This family, grounded in nonextensive entropies, extends the Jensen-Shannon divergence kernels and allows weights to be assigned to its arguments.
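To make the abstract's central quantities concrete, here is a minimal Python/NumPy sketch (not the authors' code; function names are ours) of the Tsallis entropy S_q and the weighted Jensen-Tsallis q-difference T_q as usually defined in the nonextensive-entropy literature; with equal weights and q = 1 the q-difference reduces to the Jensen-Shannon divergence.

import numpy as np

def tsallis_entropy(p, q):
    # Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1);
    # in the limit q -> 1 this recovers the Shannon entropy (in nats).
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        nz = p[p > 0]
        return -np.sum(nz * np.log(nz))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def jensen_tsallis_q_difference(p1, p2, q, w=(0.5, 0.5)):
    # Weighted Jensen-Tsallis q-difference (assumed standard form):
    #   T_q(p1, p2) = S_q(w1*p1 + w2*p2) - (w1^q * S_q(p1) + w2^q * S_q(p2))
    # Note the weights enter the second term raised to the power q,
    # which is what makes the quantity nonextensive for q != 1.
    w1, w2 = w
    mix = w1 * np.asarray(p1, dtype=float) + w2 * np.asarray(p2, dtype=float)
    return tsallis_entropy(mix, q) - (w1 ** q * tsallis_entropy(p1, q)
                                      + w2 ** q * tsallis_entropy(p2, q))

# Example usage on two discrete distributions:
p1 = [0.5, 0.5]
p2 = [0.9, 0.1]
print(jensen_tsallis_q_difference(p1, p2, q=1.0))  # equals the JS divergence
print(jensen_tsallis_q_difference(p1, p2, q=2.0))  # a nonextensive case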
Keywords :
information theory; probability; positive definite kernels; probability measures; Jensen-Shannon divergence; Tsallis entropy; Tsallis kernels; convexity; nonextensive entropy
Conference_Titel :
2008 IEEE Information Theory Workshop (ITW '08)
Conference_Location :
Porto
Print_ISBN :
978-1-4244-2269-2
Electronic_ISBN :
978-1-4244-2271-5
DOI :
10.1109/ITW.2008.4578673