Optimal control

Start date 01/01/2012
This project studies the solution of optimal control problems characterized by control-input-affine systems affected by stochastic disturbances, in some cases also subject to state and/or control input constraints.
Essential to this project are techniques for solving the so-called stochastic Hamilton-Jacobi-Bellman (HJB) equation, a partial differential equation whose solution yields the control policy as a function of the state and, for non-stationary problems, of time as well. Both analytical and numerical approaches are of interest, although the former are possible only in very simple applications.
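One of the simple settings where the stochastic HJB equation can be solved analytically is the one-dimensional discounted stochastic linear-quadratic regulator. The sketch below (illustrative only, not the project's own code, with parameter values chosen arbitrarily) shows how a quadratic ansatz for the value function reduces the stationary HJB equation to a scalar Riccati equation, giving the control policy as a linear function of the state.

```python
# Illustrative sketch: stationary discounted stochastic LQR in 1D,
# a simple case where the stochastic HJB equation has a closed form.
# Dynamics (control-input affine): dx = (a*x + b*u) dt + sigma dW
# Cost: E ∫ exp(-rho*t) (q*x^2 + r*u^2) dt
# Ansatz V(x) = p*x^2 + c reduces the HJB equation
#   rho*V = min_u [ q*x^2 + r*u^2 + (a*x + b*u)*V'(x) + (sigma^2/2)*V''(x) ]
# to the scalar Riccati equation  rho*p = q + 2*a*p - (b^2/r)*p^2
# together with  rho*c = sigma^2 * p.
import math

# Illustrative parameter values (assumptions, not from the project)
a, b = 0.3, 1.0    # drift coefficient and input gain
q, r = 1.0, 0.5    # state and control weights
sigma = 0.2        # noise intensity
rho = 0.5          # discount rate

# Positive root of (b^2/r)*p^2 - (2*a - rho)*p - q = 0
A = b * b / r
B = 2.0 * a - rho
p = (B + math.sqrt(B * B + 4.0 * A * q)) / (2.0 * A)
c = sigma * sigma * p / rho   # constant offset caused by the noise

def V(x):
    """Value function solving the stationary stochastic HJB equation."""
    return p * x * x + c

def u_opt(x):
    """Optimal policy from minimizing the Hamiltonian: u*(x) = -(b*p/r)*x."""
    return -(b * p / r) * x
```

Note how the noise enters only through the constant term `c`: in the linear-quadratic case the feedback gain is the same as for the deterministic problem, which is one reason this case is too simple to represent the constrained, nonlinear problems the project targets numerically.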

Applications of the methods have so far been found in energy storage, in the control of pendulums, in the control of exothermic batch reactors, and in the control of some wastewater treatment processes. However, since the emphasis of this project is on theory, clearly defined relations to specific problems in society are not immediately obvious.


Figure: Solution to the HJB equation for an inverted pendulum

Published: Tue 07 Mar 2017.