DocumentCode :
1710651
Title :
Sequential change detection using estimators of entropy & divergence rate
Author :
Juvvadi, Deekshith R. ; Bansal, Rakesh K.
Author_Institution :
Systems Design Engineer, Broadcom Communication Technologies Pvt. Ltd., Bengaluru, India
fYear :
2013
Firstpage :
1
Lastpage :
5
Abstract :
We study the sequential change detection problem with incomplete knowledge of source statistics, using universal estimators of entropy and divergence rate. A novel technique to reduce the time complexity of the JB-Page change detection test (Jacob and Bansal, 2008) is described, and a lemma justifying the method is proved. Inspired by the Page (1954) test, we propose a test to detect a change from a stationary Markov ψ-mixing process to a stationary ergodic process. The statistics of both sources are unknown, except for a training sequence from the source before the change. The test uses a universal estimator of the divergence rate between a stationary ergodic process and a stationary Markov ψ-mixing process, which we propose and prove to be almost surely convergent. The estimator is based on the Fixed-Database Lempel-Ziv (FDLZ) cross-parsing technique. The proof of convergence of our divergence-rate estimator uses the almost-sure convergence of a match-length-like quantity between a stationary ergodic process and a stationary Markov ψ-mixing process, which we establish here.
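A minimal illustrative sketch in Python of the underlying idea: parse an observation window against a fixed database (the training sequence from the pre-change source) using longest-match lengths, turn the match lengths into a rough cross-entropy proxy, and drive a Page/CUSUM recursion with the resulting scores. All names (match_length, fdlz_cross_parse_lengths, cross_entropy_estimate, page_cusum) and the per-phrase scoring rule are assumptions made for illustration; this is not the estimator proposed in the paper, its convergence proof, or the JB-Page test itself.

    import math

    def match_length(database, window, start):
        """Length of the longest prefix of window[start:] that appears as a
        contiguous substring of the fixed database (strings over a finite alphabet)."""
        L = 0
        n = len(window)
        while start + L < n and window[start:start + L + 1] in database:
            L += 1
        return L

    def fdlz_cross_parse_lengths(database, window):
        """Cross-parse the window against the fixed database, phrase by phrase,
        recording (match length + 1) for each parsed phrase."""
        lengths, i = [], 0
        while i < len(window):
            L = match_length(database, window, i)
            lengths.append(L + 1)      # +1 convention common to LZ match-length estimators
            i += max(L, 1)             # advance past the parsed phrase
        return lengths

    def cross_entropy_estimate(database, window):
        """Crude cross-entropy proxy (bits/symbol) inspired by match-length estimators:
        longer matches against the database suggest lower cross entropy."""
        lengths = fdlz_cross_parse_lengths(database, window)
        return sum(math.log2(len(database)) / L for L in lengths) / len(lengths)

    def page_cusum(scores, threshold):
        """Page (1954) CUSUM recursion: alarm when the running positive part of the
        cumulative score first exceeds the threshold; returns the alarm index or None."""
        g = 0.0
        for t, s in enumerate(scores):
            g = max(0.0, g + s)
            if g > threshold:
                return t
        return None

One plausible way to wire these together (again an assumption, not the paper's construction) is to compute cross_entropy_estimate on successive observation windows, subtract an entropy estimate of the pre-change source to obtain a divergence-like score per window, and feed those scores to page_cusum.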
Keywords :
Convergence; Databases; Entropy; Information theory; Markov processes; Time complexity;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Communications (NCC), 2013 National Conference on
Conference_Location :
New Delhi, India
Print_ISBN :
978-1-4673-5950-4
Electronic_ISBN :
978-1-4673-5951-1
Type :
conf
DOI :
10.1109/NCC.2013.6487918
Filename :
6487918