MINIMAL TIME PROBLEM WITH IMPULSIVE CONTROLS
Abstract
Time-optimal control problems for systems with impulsive controls are investigated. Sufficient conditions for the existence of time-optimal controls are given. A dynamic programming principle is derived, and Lipschitz continuity of an appropriately defined value functional is established. The value functional is shown to satisfy a Hamilton-Jacobi-Bellman equation in the viscosity sense. A numerical example for a rider-swing system is presented, and it is shown that allowing impulsive controls enlarges the reachable set compared to nonimpulsive controls.
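The effect described above, that impulsive controls enlarge the reachable set, can be illustrated with a minimal sketch of a grid-based dynamic programming (value iteration) scheme on a hypothetical one-dimensional problem. The dynamics, jump size, and target set below are illustrative assumptions, not the paper's rider-swing system or its numerical method:

```python
import numpy as np

def min_time(allow_impulse, n=401, dt=0.01, jump=0.5, iters=600):
    """Approximate the minimal-time value function on a toy 1-D problem.

    State x in [-2, 2], dynamics x' = u with |u| <= 1, target set |x| <= 0.1.
    If allow_impulse is True, the control may additionally apply an impulse
    that instantly shifts the state by +/- jump at zero time cost
    (an illustrative assumption, not the paper's model).
    """
    xs = np.linspace(-2.0, 2.0, n)
    BIG = 1e3  # stand-in for "not yet known to be reachable"
    T = np.where(np.abs(xs) <= 0.1 + 1e-9, 0.0, BIG)
    for _ in range(iters):
        Tn = T.copy()
        # Regular controls: move by u*dt and pay dt in time.
        for u in (-1.0, 1.0):
            xn = np.clip(xs + u * dt, -2.0, 2.0)
            Tn = np.minimum(Tn, dt + np.interp(xn, xs, T))
        # Impulsive controls: jump instantaneously, no time cost.
        if allow_impulse:
            for j in (-jump, jump):
                xn = np.clip(xs + j, -2.0, 2.0)
                Tn = np.minimum(Tn, np.interp(xn, xs, T))
        if np.allclose(Tn, T, atol=1e-9):
            break  # value iteration has converged
        T = Tn
    return xs, T

xs, T_no_imp = min_time(allow_impulse=False)
xs, T_imp = min_time(allow_impulse=True)
# The set reachable within a fixed time budget is larger with impulses.
budget = 0.3
print("reachable points without impulses:", int((T_no_imp <= budget).sum()))
print("reachable points with impulses:   ", int((T_imp <= budget).sum()))
```

With impulses, every state can first jump close to the target and then steer the small remainder, so the minimal time drops sharply and the time-budget reachable set grows; this mirrors, in a much simpler setting, the enlargement observed for the rider-swing example.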
Origin: Files produced by the author(s)