Title :
Fast Computation of the Kullback–Leibler Divergence and Exact Fisher Information for the First-Order Moving Average Model
Author :
Makalic, Enes ; Schmidt, Daniel F.
Author_Institution :
Centre for MEGA Epidemiology, Univ. of Melbourne, Carlton, VIC, Australia
fDate :
4/1/2010 12:00:00 AM
Abstract :
In this note expressions are derived that allow computation of the Kullback-Leibler (K-L) divergence between two first-order Gaussian moving average models in O(1) time as the sample size n → ∞. These expressions can also be used to evaluate the exact Fisher information matrix in O(1) time, and provide a basis for an asymptotic expression of the K-L divergence.
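As context for the abstract, the asymptotic K-L divergence rate between two stationary Gaussian processes can be evaluated numerically from their spectral densities via the standard Whittle-type formula. The sketch below is a hypothetical numerical illustration for MA(1) models, not the authors' closed-form O(1) expressions; the function names and the discretization are assumptions for illustration only.

```python
import numpy as np

def ma1_spectral_density(omega, theta, sigma2):
    # Spectral density of a Gaussian MA(1) process x_t = e_t + theta*e_{t-1},
    # Var(e_t) = sigma2:  f(w) = (sigma2 / (2*pi)) * |1 + theta * exp(-i*w)|^2
    return (sigma2 / (2.0 * np.pi)) * (1.0 + theta**2 + 2.0 * theta * np.cos(omega))

def kl_rate_ma1(theta1, sigma2_1, theta2, sigma2_2, num_points=200000):
    # Asymptotic K-L divergence rate between two stationary Gaussian processes
    # (Whittle approximation):
    #   lim_{n->inf} D_n / n = (1/(4*pi)) * int_{-pi}^{pi} [f1/f2 - log(f1/f2) - 1] dw
    # Evaluated here by a midpoint Riemann sum over a uniform frequency grid.
    step = 2.0 * np.pi / num_points
    omega = -np.pi + (np.arange(num_points) + 0.5) * step
    ratio = ma1_spectral_density(omega, theta1, sigma2_1) / \
            ma1_spectral_density(omega, theta2, sigma2_2)
    integrand = ratio - np.log(ratio) - 1.0
    return integrand.sum() * step / (4.0 * np.pi)
```

For identical parameters the rate is zero, and it is strictly positive otherwise, consistent with the non-negativity of the K-L divergence; the closed-form expressions in the letter replace this numerical integral with an O(1) evaluation.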
Keywords :
Gaussian processes; information theory; moving average processes; Kullback–Leibler divergence; first-order Gaussian moving average models; Fisher information
Journal_Title :
Signal Processing Letters, IEEE
DOI :
10.1109/LSP.2009.2039659