Nonlinear optimal stabilizing control in discrete time
Abstract
The paper discusses connections between optimality and passivity-like properties in discrete time. The problem is set in the framework of differential/difference representations of discrete-time dynamics. The Hamilton-Jacobi-Bellman equality is specified and the optimal control is implicitly characterized. Particular cases in which explicit control solutions can be constructed are analyzed. An example concludes the paper.
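For orientation, the Hamilton-Jacobi-Bellman equality referred to above takes, in its standard discrete-time form, the shape of a Bellman fixed-point condition on the value function. The notation below (value function $V$, dynamics $F$, stage cost $\ell$) is illustrative and not taken from the paper:

```latex
% Standard discrete-time HJB (Bellman) equality -- a sketch;
% V (value function), F (transition map), \ell (stage cost)
% are assumed symbols, not the paper's notation.
V(x) = \min_{u}\,\bigl[\,\ell(x,u) + V\bigl(F(x,u)\bigr)\,\bigr],
\qquad
u^{*}(x) = \arg\min_{u}\,\bigl[\,\ell(x,u) + V\bigl(F(x,u)\bigr)\,\bigr].
```

The optimal control $u^{*}$ is defined only implicitly by the minimization, which is consistent with the abstract's remark that explicit solutions are available only in particular cases.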