• DocumentCode
    818929
  • Title
    Convergence of discretization procedures in dynamic programming
  • Author
    Bertsekas, Dimitri P.
  • Author_Institution
    University of Illinois, Urbana, IL, USA
  • Volume
    20
  • Issue
    3
  • fYear
    1975
  • fDate
    6/1/1975
  • Firstpage
    415
  • Lastpage
    419
  • Abstract
    The computational solution of discrete-time stochastic optimal control problems by dynamic programming requires, in most cases, discretization of the state and control spaces whenever these spaces are infinite. In this short paper we consider a discretization procedure often employed in practice. Under certain compactness and Lipschitz continuity assumptions we show that the solution of the discretized algorithm converges to the solution of the continuous algorithm, as the discretization grids become finer and finer. Furthermore, any control law obtained from the discretized algorithm results in a value of the cost functional which converges to the optimal value of the problem.
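  • Illustrative_Sketch
    The abstract describes the procedure only in words: replace the infinite state and control spaces by finite grids, run the usual backward DP recursion on the grids, and refine the grids. The Python sketch below is not part of the paper; it is a minimal illustration on a hypothetical one-dimensional problem, where the dynamics, stage cost, noise distribution, and nearest-grid-point projection are all illustrative assumptions chosen to be Lipschitz continuous in the spirit of the abstract's conditions.

    # A minimal sketch, not from the paper: a hypothetical one-dimensional
    # discrete-time stochastic control problem whose state and control spaces
    # [0, 1] are replaced by finite grids, after which the backward DP
    # recursion is run on the grid.  Dynamics, stage cost, noise distribution,
    # and the nearest-point projection are illustrative assumptions.
    import numpy as np

    def discretized_dp(n_grid, horizon=10,
                       noise=(-0.05, 0.0, 0.05), probs=(0.25, 0.5, 0.25)):
        """Backward DP on an n_grid-point discretization of states and controls."""
        grid = np.linspace(0.0, 1.0, n_grid)   # shared grid for states and controls
        J = np.zeros(n_grid)                    # terminal cost J_N(x) = 0

        def project(x):
            # Map an off-grid next state to the index of the nearest grid point.
            idx = int(round(x * (n_grid - 1)))
            return min(max(idx, 0), n_grid - 1)

        for _ in range(horizon):
            J_new = np.empty(n_grid)
            for i, x in enumerate(grid):
                best = np.inf
                for u in grid:                   # discretized control space
                    exp_cost = 0.0
                    for w, p in zip(noise, probs):
                        x_next = min(max(0.7 * x + 0.3 * u + w, 0.0), 1.0)
                        exp_cost += p * ((x - 0.5) ** 2 + 0.1 * u ** 2
                                         + J[project(x_next)])
                    best = min(best, exp_cost)
                J_new[i] = best
            J = J_new
        return J

    # Refining the grid: the DP value at x = 0 should settle down as n_grid grows,
    # mirroring the convergence statement in the abstract.
    for n in (5, 11, 21, 41):
        print(n, discretized_dp(n)[0])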
  • Keywords
    Dynamic programming; Nonlinear systems, stochastic discrete-time; Optimal stochastic control; Stochastic optimal control; Convergence; Cost function; Heuristic algorithms; Optimal control; Probability distribution; Stochastic processes
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Automatic Control
  • Publisher
    IEEE
  • ISSN
    0018-9286
  • Type
    jour
  • DOI
    10.1109/TAC.1975.1100984
  • Filename
    1100984