DocumentCode :
788400
Title :
Universal linear least squares prediction: upper and lower bounds
Author :
Singer, Andrew C. ; Kozat, Suleyman S. ; Feder, Meir
Author_Institution :
Dept. of Electr. & Comput. Eng., Illinois Univ., Urbana, IL, USA
Volume :
48
Issue :
8
fYear :
2002
fDate :
1 August 2002
Firstpage :
2354
Lastpage :
2362
Abstract :
We consider the problem of sequential linear prediction of real-valued sequences under the square-error loss function. For this problem, a prediction algorithm has been demonstrated whose accumulated squared prediction error, for every bounded sequence, is asymptotically as small as that of the best fixed linear predictor for that sequence, taken from the class of all linear predictors of a given order p. The redundancy, or excess prediction error above that of the best predictor for that sequence, is upper-bounded by A²p ln(n)/n, where n is the data length and the sequence is assumed to be bounded by some A. We provide an alternative proof of this result by connecting it with universal probability assignment. We then show that this predictor is optimal in a min-max sense, by deriving a corresponding lower bound, such that no sequential predictor can ever do better than a redundancy of A²p ln(n)/n.
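The setting in the abstract can be illustrated numerically. The sketch below is a plain recursive least squares predictor, not necessarily the paper's exact universal algorithm: it predicts each sample from the previous p samples using coefficients fit to all data seen so far, then compares its accumulated squared error to that of the best fixed order-p linear predictor chosen in hindsight, so the gap per symbol plays the role of the redundancy. All function names and the test sequence are illustrative assumptions.

```python
import numpy as np

def sequential_ls_error(x, p, delta=1.0):
    """Accumulated squared error of a sequential (recursive)
    least squares predictor of order p, with regularization delta.
    A sketch of sequential prediction, not the paper's exact scheme."""
    n = len(x)
    R = delta * np.eye(p)          # regularized autocorrelation estimate
    r = np.zeros(p)                # cross-correlation estimate
    err = 0.0
    for t in range(p, n):
        u = x[t - p:t][::-1]       # the p most recent samples
        w = np.linalg.solve(R, r)  # current least squares coefficients
        err += (x[t] - w @ u) ** 2
        R += np.outer(u, u)        # update statistics with the new pair
        r += x[t] * u
    return err

def best_fixed_ls_error(x, p):
    """Accumulated squared error of the best fixed order-p linear
    predictor chosen in hindsight for this particular sequence."""
    n = len(x)
    U = np.array([x[t - p:t][::-1] for t in range(p, n)])
    y = x[p:]
    w, *_ = np.linalg.lstsq(U, y, rcond=None)
    return float(np.sum((y - U @ w) ** 2))

# A bounded sequence (|x_t| <= A = 1): a clipped noisy AR(2) process.
rng = np.random.default_rng(0)
x = np.zeros(1000)
for t in range(2, len(x)):
    x[t] = np.clip(0.5 * x[t-1] - 0.3 * x[t-2]
                   + 0.1 * rng.standard_normal(), -1.0, 1.0)

p, n = 2, len(x)
seq = sequential_ls_error(x, p)
best = best_fixed_ls_error(x, p)
print(f"per-symbol redundancy: {(seq - best) / n:.5f}")
print(f"p ln(n)/n scale (A=1): {p * np.log(n) / n:.5f}")
```

For sequences bounded by A, the paper's result says this per-symbol gap is at most A²p ln(n)/n asymptotically, and no sequential predictor can do better in the min-max sense.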
Keywords :
least squares approximations; minimax techniques; prediction theory; probability; accumulated squared prediction error; bounded sequence; data length; excess prediction error; lower bound; min-max optimal predictor; prediction algorithm; real-valued sequences; redundancy; sequential linear prediction; sequential predictor; square-error loss function; universal linear least squares prediction; universal probability assignment; upper bound; Engineering profession; Joining processes; Least squares methods; Neural networks; Prediction algorithms; Redundancy; Source coding; Vectors;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2002.800489
Filename :
1019843