DocumentCode :
1515856
Title :
Information-Based Complexity, Feedback and Dynamics in Convex Programming
Author :
Raginsky, Maxim; Rakhlin, Alexander
Author_Institution :
Dept. of Electr. & Comput. Eng., Duke Univ., Durham, NC, USA
Volume :
57
Issue :
10
fYear :
2011
Firstpage :
7036
Lastpage :
7056
Abstract :
We study the intrinsic limitations of sequential convex optimization through the lens of feedback information theory. In the oracle model of optimization, an algorithm queries an oracle for noisy information about the unknown objective function, and the goal is to (approximately) minimize every function in a given class using as few queries as possible. We show that, in order for a function to be optimized, the algorithm must be able to accumulate enough information about the objective. This, in turn, puts limits on the speed of optimization under specific assumptions on the oracle and the type of feedback. Our techniques are akin to the ones used in the statistical literature to obtain minimax lower bounds on the risks of estimation procedures; the notable difference is that, unlike in the case of i.i.d. data, a sequential optimization algorithm can gather observations in a controlled manner, so that the amount of information at each step is allowed to change in time. In particular, we show that optimization algorithms often obey the law of diminishing returns: the signal-to-noise ratio drops as the optimization algorithm approaches the optimum. To underscore the generality of the tools, we use our approach to derive fundamental lower bounds for a certain active learning problem. Overall, the present work connects the intuitive notions of “information” in optimization, experimental design, estimation, and active learning to the quantitative notion of Shannon information.
Keywords :
convex programming; feedback; information theory; minimax techniques; sequential estimation; Shannon information; active learning problem; feedback information theory; information-based complexity; minimax lower bound; quantitative notion; sequential convex optimization; sequential optimization algorithm; signal-to-noise ratio; statistical literature; Accuracy; Complexity theory; Convex functions; Markov processes; Noise measurement; Optimization; Random variables; Convex optimization; Fano's inequality; feedback information theory; hypothesis testing with controlled observations; information-based complexity; information-theoretic converse; minimax lower bounds; sequential optimization algorithms; statistical estimation;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2011.2154375
Filename :
5766746