Title :
An Introduction to Dynamic Programming
Abstract :
Optimal control theory is introduced at a level appropriate for the undergraduate student. The principle of optimality is derived and used to develop the computational algorithm of dynamic programming. Two numerical examples, a routing problem and an optimal control problem, are solved to illustrate the basic concepts and the computational procedure. Supplementary problems with partial solutions are provided.
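The abstract's routing example is a natural fit for a short illustration. Below is a minimal sketch (not taken from the paper) of dynamic programming applied to a routing problem: the cheapest path through a small directed graph is found by backward recursion, which is exactly the principle of optimality the abstract refers to. The node names and arc costs are invented for illustration only.

```python
# Illustrative routing (shortest-path) problem solved by dynamic programming.
# costs[node] maps each successor node to the cost of the connecting arc.
# All values here are hypothetical, not from the article.
costs = {
    "A": {"B": 2, "C": 5},
    "B": {"D": 4, "E": 1},
    "C": {"D": 2, "E": 3},
    "D": {"F": 3},
    "E": {"F": 2},
    "F": {},  # terminal node
}

def solve(terminal="F"):
    """Return (optimal cost-to-go, best successor) for every node."""
    value = {terminal: 0.0}    # optimal cost from each node to the terminal
    policy = {terminal: None}  # best next node to move to
    unsolved = {n for n in costs if n != terminal}
    # Backward pass: a node is solved once all of its successors are solved.
    while unsolved:
        for node in list(unsolved):
            if all(succ in value for succ in costs[node]):
                # Principle of optimality: the best decision at `node` is the
                # arc cost plus the already-optimal cost-to-go of the successor.
                succ, best = min(
                    ((s, c + value[s]) for s, c in costs[node].items()),
                    key=lambda item: item[1],
                )
                value[node], policy[node] = best, succ
                unsolved.remove(node)
    return value, policy

if __name__ == "__main__":
    value, policy = solve()
    # Recover the optimal route from the start node "A".
    route, node = ["A"], "A"
    while policy[node] is not None:
        node = policy[node]
        route.append(node)
    print("optimal cost:", value["A"])          # 5.0
    print("optimal route:", " -> ".join(route)) # A -> B -> E -> F
```

Working backward from the terminal node, each node's cost-to-go is computed once from already-optimal successor values, so the full enumeration of routes is never needed; this is the computational saving the dynamic programming algorithm provides.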
Keywords :
Control system synthesis; Control systems; Control theory; Cost function; Dynamic programming; Heuristic algorithms; Optimal control; Routing; System analysis and design
Journal_Title :
IEEE Transactions on Education
DOI :
10.1109/TE.1967.4320291