
Hamiltonian (control theory)

The Hamiltonian of optimal control theory was developed by Lev Pontryagin as part of his maximum principle.[1] It was inspired by, but is distinct from, the Hamiltonian of classical mechanics. Pontryagin proved that a necessary condition for solving the optimal control problem is that the control should be chosen so as to minimize the Hamiltonian. For details see Pontryagin's maximum principle.


Notation and Problem statement

A control $u(t)$ is to be chosen so as to minimize the objective function

$$ J(u) = \Psi(x(T)) + \int_{0}^{T} L(x(t), u(t), t) \, dt $$

where $x(t)$ is the system state, which evolves according to the state equations

$$ \dot{x}(t) = f(x(t), u(t), t), \qquad x(0) = x_0, \qquad t \in [0, T] $$

and the control must satisfy the constraints

$$ a \le u(t) \le b $$

Definition of the Hamiltonian

$$ H(x(t), u(t), \lambda(t), t) = \lambda^{\mathsf{T}}(t)\, f(x(t), u(t), t) + L(x(t), u(t), t) $$

where $\lambda(t)$ is a vector of costate variables of the same dimension as the state variables $x(t)$.

For information on the properties of the Hamiltonian, see Pontryagin's maximum principle.
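
For orientation, the necessary conditions of the maximum principle can be summarized in terms of this Hamiltonian as follows (this is only a sketch of the standard statement; the precise hypotheses are given in the main article). Along an optimal pair $(x^{*}(t), u^{*}(t))$ there exists a costate trajectory $\lambda(t)$ such that

$$ \dot{x}^{*}(t) = \frac{\partial H}{\partial \lambda} = f(x^{*}(t), u^{*}(t), t) $$

$$ \dot{\lambda}(t) = -\frac{\partial H}{\partial x} $$

$$ u^{*}(t) = \arg\min_{a \le u \le b} H(x^{*}(t), u, \lambda(t), t) $$

$$ \lambda(T) = \frac{\partial \Psi}{\partial x}\big(x(T)\big) $$

with the last (transversality) condition applying when the terminal state is free.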

The Hamiltonian in discrete time

When the problem is formulated in discrete time, the Hamiltonian is defined as:

$$ H(x_t, u_t, \lambda_{t+1}, t) = \lambda_{t+1}^{\mathsf{T}} f(x_t, u_t, t) + L(x_t, u_t, t) $$

and the costate equations are

$$ \lambda_{t+1} - \lambda_t = -\frac{\partial H}{\partial x_t} $$

(Note that the discrete time Hamiltonian at time $t$ involves the costate variable at time $t+1$.[2] This small detail is essential so that when we differentiate with respect to $x_t$ we get a term involving $\lambda_{t+1}$ on the right hand side of the costate equations. Using a wrong convention here can lead to incorrect results, i.e. a costate equation which is not a backwards difference equation.)
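
To make the timing convention concrete, here is a minimal numerical sketch (in Python; the scalar transition `f`, running cost `L`, and parameter values are hypothetical, chosen purely for illustration). It propagates the state forward and the costate backward, so that $\lambda_t$ is computed from $\lambda_{t+1}$ exactly as the convention above requires.

```python
import numpy as np

# Hypothetical scalar example, only to illustrate the indexing convention:
# minimize sum_t L(x_t, u_t) subject to x_{t+1} = f(x_t, u_t).
def f(x, u):
    return 0.9 * x + u              # state transition x_{t+1} = f(x_t, u_t)

def L(x, u):
    return x**2 + u**2              # running cost

def dH_dx(x, u, lam_next):
    # H(x_t, u_t, lambda_{t+1}) = lambda_{t+1} * f(x_t, u_t) + L(x_t, u_t),
    # so dH/dx_t = lambda_{t+1} * df/dx_t + dL/dx_t, which involves lambda_{t+1}.
    return lam_next * 0.9 + 2.0 * x

T = 10
u = np.zeros(T)                     # an arbitrary fixed control sequence
x = np.zeros(T + 1)
x[0] = 1.0
for t in range(T):                  # forward pass: state equation
    x[t + 1] = f(x[t], u[t])

lam = np.zeros(T + 1)               # lam[T] = 0 (no terminal cost in this toy problem)
for t in reversed(range(T)):        # backward pass: lambda_t = lambda_{t+1} + dH/dx_t
    lam[t] = lam[t + 1] + dH_dx(x[t], u[t], lam[t + 1])

print(lam[:3])                      # costates obtained from a backwards recursion
```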

The Hamiltonian of control compared to the Hamiltonian of mechanics

William Rowan Hamilton defined the Hamiltonian for describing the mechanics of a system. It is a function of three variables:

$$ H(q, p, t) = \langle p, \dot{q} \rangle - L(q, \dot{q}, t) $$

where $L$ is the Lagrangian, the extremizing of which determines the dynamics (not the Lagrangian defined above), $q$ is the state variable and $\dot{q}$ is its time derivative.

$p$ is the so-called "conjugate momentum", defined by

$$ p = \frac{\partial L}{\partial \dot{q}} $$

Hamilton then formulated his equations to describe the dynamics of the system as

$$ \frac{dp}{dt} = -\frac{\partial H}{\partial q} $$

$$ \frac{dq}{dt} = \frac{\partial H}{\partial p} $$

The Hamiltonian of control theory describes not the dynamics of a system but conditions for extremizing some scalar function thereof (the Lagrangian) with respect to a control variable $u$. As normally defined, it is a function of 4 variables

$$ H(q, u, p, t) = \langle p, \dot{q} \rangle - L(q, u, t) $$

where $q$ is the state variable and $u$ is the control variable with respect to which we are extremizing.

The associated conditions for a maximum are

$$ \frac{dp}{dt} = -\frac{\partial H}{\partial q} $$

$$ \frac{dq}{dt} = \frac{\partial H}{\partial p} $$

$$ \frac{\partial H}{\partial u} = 0 $$

This definition agrees with that given in the article by Sussmann and Willems[3] (see p. 39, equation 14). Sussmann and Willems show how the control Hamiltonian can be used in dynamics, e.g. for the brachystochrone problem, but do not mention the prior work of Carathéodory on this approach.[4]
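
One way to see the connection between the two Hamiltonians (a standard observation, not made explicit in the references above) is to take the velocity itself as the control, i.e. $\dot{q} = u$. The control Hamiltonian then becomes

$$ H(q, u, p, t) = \langle p, u \rangle - L(q, u, t) $$

and the condition $\partial H / \partial u = 0$ reproduces the definition of the conjugate momentum, $p = \partial L / \partial \dot{q}$, while the remaining two conditions become Hamilton's equations; along $\partial H / \partial u = 0$ the control Hamiltonian coincides with the Hamiltonian of mechanics.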

Example: Ramsey Model

Take a simplified version of the Ramsey–Cass–Koopmans model. We wish to maximize an agent's discounted lifetime utility achieved through consumption

$$ \int_{0}^{\infty} e^{-\rho t} u(c(t)) \, dt $$

subject to the time evolution of capital per effective worker

$$ \dot{k}(t) = f(k(t)) - (n + \delta) k(t) - c(t) $$

where $c(t)$ is period $t$ consumption, $k(t)$ is period $t$ capital per worker, $f(k(t))$ is period $t$ production, $n$ is the population growth rate, $\delta$ is the capital depreciation rate, the agent discounts future utility at rate $\rho$, with $u'(c) > 0$ and $u''(c) < 0$.

Here, $k(t)$ is the state variable which evolves according to the above equation, and $c(t)$ is the control variable. The Hamiltonian becomes

$$ H(k, c, \mu, t) = e^{-\rho t} u(c(t)) + \mu(t) \left[ f(k(t)) - (n + \delta) k(t) - c(t) \right] $$

The optimality conditions are

$$ \frac{\partial H}{\partial c} = 0 \quad \Rightarrow \quad e^{-\rho t} u'(c(t)) = \mu(t) $$

$$ \frac{\partial H}{\partial k} = -\dot{\mu}(t) \quad \Rightarrow \quad \mu(t) \left[ f'(k(t)) - (n + \delta) \right] = -\dot{\mu}(t) $$

If we let $u(c) = \log(c)$, then log-differentiating the first optimality condition with respect to $t$ yields

$$ -\rho - \frac{\dot{c}(t)}{c(t)} = \frac{\dot{\mu}(t)}{\mu(t)} $$
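
In more detail, that step runs as follows: with $u(c) = \log(c)$ we have $u'(c) = 1/c$, so the first optimality condition reads

$$ \frac{e^{-\rho t}}{c(t)} = \mu(t) \quad \Longrightarrow \quad -\rho t - \ln c(t) = \ln \mu(t) $$

and differentiating both sides with respect to $t$ gives the relation just displayed.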

Inserting this expression for $\dot{\mu}(t)/\mu(t)$ into the second optimality condition yields

$$ \frac{\dot{c}(t)}{c(t)} = f'(k(t)) - (n + \delta) - \rho $$

which is the Keynes–Ramsey rule, or Euler–Lagrange equation for this problem: a condition on consumption in every period which, if followed, ensures maximum lifetime utility.
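
As a rough numerical check (not part of the model statement above; the production function $f(k) = k^{\alpha}$, the parameter values, and the variable names are assumptions made only for this sketch), one can verify that the capital accumulation equation and the Keynes–Ramsey rule are simultaneously stationary at the steady state implied by $f'(k^{*}) = n + \delta + \rho$:

```python
# Assumed functional form and parameter values (illustration only).
alpha, n, delta, rho = 0.3, 0.01, 0.05, 0.03

f = lambda k: k**alpha                          # production per worker
f_prime = lambda k: alpha * k**(alpha - 1.0)    # marginal product of capital

# Dynamics implied by the state equation and the Keynes-Ramsey rule:
#   k_dot = f(k) - (n + delta) * k - c
#   c_dot = c * (f'(k) - (n + delta) - rho)
def k_dot(k, c):
    return f(k) - (n + delta) * k - c

def c_dot(k, c):
    return c * (f_prime(k) - (n + delta) - rho)

# Steady state: f'(k*) = n + delta + rho and c* = f(k*) - (n + delta) * k*
k_star = (alpha / (n + delta + rho)) ** (1.0 / (1.0 - alpha))
c_star = f(k_star) - (n + delta) * k_star

print(k_star, c_star)                                 # steady-state capital and consumption
print(k_dot(k_star, c_star), c_dot(k_star, c_star))   # both approximately zero
```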


References

  1. ^ Dixit, Avinash K. (1990). Optimization in Economic Theory. New York: Oxford University Press. pp. 145–161. ISBN 0-19-877210-6.
  2. ^ Varaiya, Chapter 6
  3. ^ Sussmann, Héctor J.; Willems, Jan C. (June 1997). "300 Years of Optimal Control: From the Brachystochrone to the Maximum Principle" (PDF). IEEE Control Systems.
  4. ^ See Pesch, H. J.; Bulirsch, R. (1994). "The maximum principle, Bellman's equation, and Carathéodory's work". Journal of Optimization Theory and Applications. 80 (2): 199–225. doi:10.1007/BF02192933.
