Abstract:
An algorithm for mean squared error (MSE) minimization through optimization of the bias-to-variance trade-off has recently been proposed and used in the literature. The algorithm is based on the intersection of confidence intervals (ICI) rule and does not require explicit knowledge of the estimation bias to achieve near-optimal parameter estimation. This paper presents a detailed analysis of the algorithm's performance, including procedures and relations that can be used for fine adjustment of the algorithm parameters. The reliability of the algorithm is studied for various types of estimation noise. The results are confirmed on a simulated example with uniform, Gaussian, and Laplacian noise, and an application of the algorithm to a simple filtering example is illustrated.
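The ICI rule summarized above can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the paper's implementation: it takes a sequence of estimates ordered by increasing smoothing (decreasing variance, typically growing bias), builds a confidence interval around each, and selects the last estimate whose interval still intersects all previous ones. The threshold parameter `gamma`, whose fine adjustment the paper analyzes, is set to a placeholder value here.

```python
import numpy as np

def ici_select(estimates, sigmas, gamma=2.0):
    """Intersection of Confidence Intervals (ICI) rule -- minimal sketch.

    estimates : estimates ordered by increasing smoothing
                (variance decreases along the sequence, bias typically grows)
    sigmas    : standard deviations of the corresponding estimates
    gamma     : confidence-interval width parameter (illustrative value;
                its fine adjustment is the subject of the paper's analysis)

    Returns the index of the last estimate whose confidence interval
    [x - gamma*sigma, x + gamma*sigma] intersects all previous intervals.
    """
    lower, upper = -np.inf, np.inf  # running intersection of the intervals
    chosen = 0
    for k, (x, s) in enumerate(zip(estimates, sigmas)):
        lower = max(lower, x - gamma * s)
        upper = min(upper, x + gamma * s)
        if lower > upper:  # intersection became empty: bias now dominates
            break
        chosen = k
    return chosen
```

As variance shrinks, the intervals narrow; once growing bias pushes an interval away from the running intersection, the intersection empties and the previous index is taken as the near-optimal choice, balancing bias against variance without estimating the bias itself.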
Keywords:
Performance analysis, Adaptive algorithm, Signal processing algorithms, Multidimensional signal processing, Algorithm design and analysis, Adaptive signal processing, Smoothing methods, Filtering, Time-frequency analysis, Frequency estimation