DocumentCode :
971115
Title :
Worst-case quadratic loss bounds for prediction using linear functions and gradient descent
Author :
Cesa-Bianchi, Nicolò ; Long, Philip M. ; Warmuth, Manfred K.
Author_Institution :
Dipartimento di Sci. dell'Inf., Milan Univ., Italy
Volume :
7
Issue :
3
fYear :
1996
fDate :
1 May 1996
Firstpage :
604
Lastpage :
619
Abstract :
We study the performance of gradient descent (GD) when applied to the problem of online linear prediction in arbitrary inner product spaces. We prove worst-case bounds on the sum of the squared prediction errors under various assumptions concerning the amount of a priori information about the sequence to predict. The algorithms we use are variants and extensions of online GD. Whereas our algorithms always predict using linear functions as hypotheses, none of our results requires the data to be linearly related. In fact, the bounds proved on the total prediction loss are typically expressed as a function of the total loss of the best fixed linear predictor with bounded norm. All the upper bounds are tight to within constants, and matching lower bounds are provided in some cases. Finally, we apply our results to the problem of online prediction for classes of smooth functions.
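The following is a minimal Python sketch of the basic online GD (Widrow-Hoff) predictor that the abstract describes, assuming the inner product space is R^n and a hand-picked fixed learning rate; it is not the authors' exact variants, which tune the rate using a priori information about the sequence. The names online_gd and eta, and the example data, are illustrative assumptions.

# Minimal sketch of online GD for linear prediction with quadratic loss
# (assumptions: R^n instances, fixed learning rate `eta`; not the paper's tuned variants).
import numpy as np

def online_gd(trials, n, eta=0.1):
    """Predict each outcome with the current linear hypothesis, then take a
    gradient step on the squared error; return the total quadratic loss."""
    w = np.zeros(n)                       # current linear hypothesis
    total_loss = 0.0
    for x, y in trials:
        y_hat = float(np.dot(w, x))       # prediction <w, x_t>
        total_loss += (y_hat - y) ** 2    # quadratic loss on this trial
        w = w - eta * (y_hat - y) * x     # GD update on the squared error
    return total_loss

# Example: a short sequence of (instance, outcome) pairs in R^3.
rng = np.random.default_rng(0)
trials = [(rng.normal(size=3), float(rng.normal())) for _ in range(100)]
print(online_gd(trials, n=3, eta=0.05))

In the paper's analysis, the total loss of such a predictor is compared against the total loss of the best fixed linear predictor with bounded norm; the choice of learning rate governs the constants in those worst-case bounds.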
Keywords :
error analysis; functions; linear predictive coding; losses; online operation; prediction theory; sequences; a priori information; bounded norm; fixed linear predictor; gradient descent; hypotheses; inner product spaces; linear functions; lower bounds; online linear prediction; performance; smooth functions; sum of squared prediction errors; tight upper bounds; total prediction loss; worst-case quadratic loss bounds; Algorithm design and analysis; Computer science; Prediction algorithms; Predictive models; Upper bound;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.501719
Filename :
501719