DocumentCode :
1235326
Title :
An Introduction to Dynamic Programming
Author :
Kirk, Donald E.
Volume :
10
Issue :
4
Year :
1967
Firstpage :
212
Lastpage :
219
Abstract :
Optimal control theory is introduced at a level appropriate for the undergraduate student. The principle of optimality is derived and used to develop the computational algorithm of dynamic programming. Two numerical examples, a routing problem and an optimal control problem, are solved to illustrate the basic concepts and the computational procedure. Supplementary problems with partial solutions are provided.
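Illustrative note (not part of the original record): the abstract refers to a routing problem solved by the computational algorithm of dynamic programming. A minimal Python sketch of backward dynamic programming on a small stage-wise routing problem is given below; the graph, stage structure, and arc costs are hypothetical examples, not the ones used in the paper.

# Minimal sketch: backward dynamic programming for a stage-wise routing problem.
# The stages and arc costs below are hypothetical, chosen only for illustration.

stages = [
    ["A"],          # start node
    ["B1", "B2"],
    ["C1", "C2"],
    ["D"],          # destination node
]

# Arc costs between nodes of consecutive stages (hypothetical values).
cost = {
    ("A", "B1"): 2, ("A", "B2"): 5,
    ("B1", "C1"): 4, ("B1", "C2"): 1,
    ("B2", "C1"): 2, ("B2", "C2"): 3,
    ("C1", "D"): 3, ("C2", "D"): 6,
}

# J[node] = minimum cost-to-go from node to the destination.
J = {"D": 0.0}
policy = {}

# Principle of optimality: the optimal cost-to-go at a node equals the best
# immediate arc cost plus the optimal cost-to-go from the node it leads to.
# Work backward from the last decision stage to the start.
for stage, next_stage in zip(reversed(stages[:-1]), reversed(stages[1:])):
    for node in stage:
        best_cost, best_next = min(
            (cost[(node, nxt)] + J[nxt], nxt)
            for nxt in next_stage
            if (node, nxt) in cost
        )
        J[node] = best_cost
        policy[node] = best_next

# Recover the optimal route by following the stored policy forward.
route, node = ["A"], "A"
while node != "D":
    node = policy[node]
    route.append(node)

print("minimum cost:", J["A"])     # 9 for the costs above
print("optimal route:", route)     # ['A', 'B1', 'C1', 'D']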
Keywords :
Control system synthesis; Control systems; Control theory; Cost function; Dynamic programming; Heuristic algorithms; Optimal control; Routing; System analysis and design
Language :
English
Journal_Title :
IEEE Transactions on Education
Publisher :
IEEE
ISSN :
0018-9359
Type :
jour
DOI :
10.1109/TE.1967.4320291
Filename :
4320291