Title :
Cepstral coefficients, covariance lags, and pole-zero models for finite data strings
Author :
Byrnes, C.I. ; Enqvist, Per ; Lindquist, Anders
Author_Institution :
Dept. of Syst. Sci. & Math., Washington Univ., St. Louis, MO, USA
fDate :
4/1/2001 12:00:00 AM
Abstract :
One of the most widely used methods of spectral estimation in signal and speech processing is linear predictive coding (LPC). LPC has some attractive features, which account for its popularity, including the properties that the resulting modeling filter (i) matches a finite window of n+1 covariance lags, (ii) is rational of degree at most n, and (iii) has stable zeros and poles. The only limiting factor of this methodology is that the modeling filter is "all-pole," i.e., an autoregressive (AR) model. In this paper, we present a systematic description of all autoregressive moving-average (ARMA) models of processes that have properties (i)-(iii) in the context of cepstral analysis and homomorphic filtering. We show that each such ARMA model determines and is completely determined by its finite windows of cepstral coefficients and covariance lags. We show that these nth-order windows form local coordinates for all ARMA models of degree n and that the pole-zero model can be determined from the windows as the unique minimum of a convex objective function. We refine this optimization method by first noting that the maximum entropy design of an LPC filter is obtained by maximizing the zeroth cepstral coefficient, subject to the constraint (i). More generally, we modify this scheme to a more well-posed optimization problem where the covariance data enter as a constraint and the linear weights of the cepstral coefficients are "positive," in the sense that a certain pseudo-polynomial is positive, thus succinctly generalizing the maximum entropy method. This new problem is a homomorphic filter generalization of the maximum entropy method, providing a procedure for the design of any stable, minimum-phase modeling filter of degree less than or equal to n that interpolates the given covariance window. We present an algorithm for realizing these filters in a lattice-ladder form, given the covariance window and the moving average part of the model. While we also show how to determine the moving average part using cepstral smoothing, one can make use of any good a priori estimate for the system zeros to initialize the algorithm. We conclude the paper by applying the method to an example from the literature on ARMA modeling.
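The quantities the abstract relates (a covariance window, the all-pole LPC modeling filter that matches it, and the cepstral window of the resulting minimum-phase spectrum) can be illustrated with a short numerical sketch. The following Python fragment is not taken from the paper; it assumes NumPy, uses biased sample covariances, the standard Levinson-Durbin recursion for the LPC filter, and an FFT approximation of the cepstral coefficients, and all function names are hypothetical.

# Minimal sketch (not from the paper) of the three objects the abstract relates:
# a covariance window, the all-pole LPC filter matching it, and the cepstral
# window of the resulting minimum-phase model spectrum. NumPy and the function
# names below are assumptions introduced for illustration only.
import numpy as np

def covariance_lags(y, n):
    # Biased sample covariance lags c_0, ..., c_n of a finite data string y.
    N = len(y)
    return np.array([np.dot(y[: N - k], y[k:]) / N for k in range(n + 1)])

def levinson_lpc(c):
    # Levinson-Durbin recursion: AR polynomial a (with a[0] = 1) and innovation
    # variance whose spectrum matches the covariance window c_0, ..., c_n.
    n = len(c) - 1
    a = np.zeros(n + 1)
    a[0] = 1.0
    err = c[0]
    for k in range(1, n + 1):
        acc = c[k] + np.dot(a[1:k], c[k - 1:0:-1])
        refl = -acc / err
        a_prev = a.copy()
        for j in range(1, k):
            a[j] = a_prev[j] + refl * a_prev[k - j]
        a[k] = refl
        err *= 1.0 - refl ** 2
    return a, err

def cepstral_window(a, sigma2, n, n_fft=4096):
    # First n+1 cepstral coefficients of Phi(e^{iw}) = sigma2 / |A(e^{iw})|^2,
    # approximated by the inverse FFT of log Phi on a dense frequency grid.
    A = np.fft.fft(a, n_fft)
    log_phi = np.log(sigma2) - 2.0 * np.log(np.abs(A))
    return np.fft.ifft(log_phi).real[: n + 1]

# Example: fit an all-pole model of degree n = 4 to a simulated data string.
rng = np.random.default_rng(0)
y = rng.standard_normal(2000)
n = 4
c = covariance_lags(y, n)          # covariance window, constraint (i)
a, sigma2 = levinson_lpc(c)        # LPC modeling filter, stable and minimum phase
cep = cepstral_window(a, sigma2, n)
print("covariance window:", c)
print("cepstral window:  ", cep)

In the abstract's terms, cep[0] is the zeroth cepstral coefficient that the maximum entropy (LPC) design maximizes, and c is the covariance window that enters the more general optimization problem as a constraint.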
Keywords :
autoregressive moving average processes; cepstral analysis; covariance analysis; ladder filters; lattice filters; linear predictive coding; maximum entropy methods; optimisation; poles and zeros; smoothing methods; speech processing; AR model; ARMA models; LPC; all-pole filter; autoregressive model; autoregressive moving-average models; cepstral coefficients; cepstral smoothing; covariance data; covariance lags; covariance window; finite data strings; homomorphic filtering; lattice-ladder form; linear predictive coding; linear weights; local coordinates; maximum entropy design; maximum entropy method; minimum-phase modeling filter; modeling filter; nth-order windows; optimization method; pole-zero model; pole-zero models; pseudo-polynomial; spectral estimation; steady-state behavior; transient behavior; well-posed optimization problem; Cepstral analysis; Entropy; Limiting; Linear predictive coding; Matched filters; Nonlinear filters; Poles and zeros; Predictive models; Signal processing; Speech processing;
Journal_Title :
Signal Processing, IEEE Transactions on