DocumentCode :
1038404
Title :
Nonparametric hypothesis tests for statistical dependency
Author :
Ihler, Alexander T. ; Fisher, John W. ; Willsky, Alan S.
Author_Institution :
Lab. for Inf. & Decision Syst., Massachusetts Inst. of Technol., Cambridge, MA, USA
Volume :
52
Issue :
8
fYear :
2004
Firstpage :
2234
Lastpage :
2249
Abstract :
Determining the structure of dependencies among a set of variables is a common task in many signal and image processing applications, including multitarget tracking and computer vision. In this paper, we present an information-theoretic, machine learning approach to problems of this type. We cast this problem as a hypothesis test between factorizations of variables into mutually independent subsets. We show that the likelihood ratio can be written as sums of two sets of Kullback-Leibler (KL) divergence terms. The first set captures the structure of the statistical dependencies within each hypothesis, whereas the second set measures the details of model differences between hypotheses. We then consider the case when the signal prior models are unknown, so that the distributions of interest must be estimated directly from data, showing that the second set of terms is (asymptotically) negligible and quantifying the loss in hypothesis separability when the models are completely unknown. We demonstrate the utility of nonparametric estimation methods for such problems, providing a general framework for determining and distinguishing between dependency structures in highly uncertain environments. Additionally, we develop a machine learning approach for estimating lower bounds on KL divergence and mutual information from samples of high-dimensional random variables for which direct density estimation is infeasible. We present empirical results in the context of three prototypical applications: association of signals generated by sources possessing harmonic behavior, scene correspondence using video imagery, and detection of coherent behavior among sets of moving objects.
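In the pairwise case, the test the abstract describes amounts to comparing an estimated mutual information (a KL divergence between the joint and the product of marginals, estimated via kernel density estimates) against its distribution under the independence factorization. A minimal sketch of that idea, not the authors' implementation; the function names, the fixed Gaussian bandwidth, and the permutation-based calibration are illustrative assumptions:

```python
import numpy as np

def gaussian_kde_logpdf(data, points, bandwidth):
    """Log of a product-Gaussian kernel density estimate.

    data: (n, d) samples defining the estimate; points: (m, d) query points;
    bandwidth: (d,) per-dimension kernel widths (fixed here for simplicity).
    """
    diffs = (points[:, None, :] - data[None, :, :]) / bandwidth      # (m, n, d)
    log_k = (-0.5 * np.sum(diffs ** 2, axis=2)
             - 0.5 * data.shape[1] * np.log(2 * np.pi)
             - np.sum(np.log(bandwidth)))                            # (m, n)
    # log of the average kernel response = log-density at each query point
    return np.logaddexp.reduce(log_k, axis=1) - np.log(data.shape[0])

def mi_kde(x, y, bandwidth=0.3):
    """Resubstitution estimate of I(X;Y) from 1-D samples (biased, but
    adequate as a test statistic for this sketch)."""
    xy = np.column_stack([x, y])
    log_pxy = gaussian_kde_logpdf(xy, xy, np.array([bandwidth, bandwidth]))
    log_px = gaussian_kde_logpdf(x[:, None], x[:, None], np.array([bandwidth]))
    log_py = gaussian_kde_logpdf(y[:, None], y[:, None], np.array([bandwidth]))
    return np.mean(log_pxy - log_px - log_py)

def independence_test(x, y, n_perm=200, seed=0):
    """Permutation test: shuffling y enforces the independence hypothesis,
    so the permuted statistics approximate the null distribution."""
    rng = np.random.default_rng(seed)
    stat = mi_kde(x, y)
    null = [mi_kde(x, rng.permutation(y)) for _ in range(n_perm)]
    p_value = (1 + sum(s >= stat for s in null)) / (n_perm + 1)
    return stat, p_value
```

For strongly dependent samples (e.g. `y = x + noise`) the statistic sits far above the permutation null and the p-value is small; for independent samples it does not. The paper goes further, handling general factorizations into independent subsets and high-dimensional variables where direct density estimation is infeasible.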
Keywords :
Monte Carlo methods; image sequences; learning (artificial intelligence); nonparametric statistics; object detection; video signal processing; Kullback–Leibler divergence; coherent behavior; data association; harmonic behavior; information theory; kernel density estimates; machine learning; moving objects; nonparametric estimation methods; nonparametric hypothesis tests; object interaction detection; statistical dependency; video imagery; application software; computer vision; image processing; mutual information; prototypes; random variables; signal generators; signal processing; testing; factorization; hypothesis testing; independence tests
fLanguage :
English
Journal_Title :
IEEE Transactions on Signal Processing
Publisher :
IEEE
ISSN :
1053-587X
Type :
jour
DOI :
10.1109/TSP.2004.830994
Filename :
1315943