DocumentCode :
969163
Title :
A measure of relative entropy between individual sequences with application to universal classification
Author :
Ziv, Jacob; Merhav, Neri
Author_Institution :
Dept. of Electr. Eng., Technion-Israel Inst. of Technol., Haifa, Israel
Volume :
39
Issue :
4
fYear :
1993
fDate :
7/1/1993
Firstpage :
1270
Lastpage :
1279
Abstract :
A new notion of empirical informational divergence (relative entropy) between two individual sequences is introduced. If the two sequences are independent realizations of two finite-order, finite-alphabet, stationary Markov processes, the empirical relative entropy converges to the relative entropy almost surely. This empirical divergence is based on a version of the Lempel-Ziv data compression algorithm. A simple universal algorithm for classifying individual sequences into a finite number of classes, based on the empirical divergence, is then presented. The algorithm discriminates between the classes whenever they are distinguishable by some finite-memory classifier, for almost every given training set and almost any test sequence from these classes. It is universal in the sense that it is independent of the unknown sources.
Keywords :
Markov processes; data compression; entropy; finite state machines; information theory; Lempel-Ziv data compression algorithm; empirical informational divergence; finite alphabet processes; finite order processes; finite-memory classifier; individual sequences; relative entropy; stationary Markov processes; universal classification; Classification algorithms; Data compression; Distributed computing; Entropy; Information theory; Jacobian matrices; Markov processes; Probability; Q measurement; Testing;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.243444
Filename :
243444