Dynamic Systems
Valery N. Afanas’ev, Nataly A. Frolova. Differential game in the problem of controlling a nonlinear object with restrictions on control actions
Abstract. 

The optimal control problem is formulated as a differential game with restrictions on the control actions for a class of controlled dynamic systems whose nonlinear objects can be represented as objects with a linear structure and state-dependent parameters (SDC models). The linear structure of the transformed nonlinear system and a quadratic quality functional of a special kind make it possible, when synthesizing the optimal control (i.e., finding the controller parameters), to pass from searching for solutions of the Bellman-Isaacs equation to solving a Riccati-type equation with state-dependent parameters. The synthesized controls endow the SDC model with the property of asymptotic stability and allow one to determine the ratio of the constraints imposed on the controls under which the condition for the existence of a zero-sum differential game is satisfied. As an illustration of the results obtained, a simulation of the behavior of a nonlinear system with two players on an infinite control interval (with an open horizon) is presented.
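
For orientation, here is a minimal sketch of a standard zero-sum SDRE game formulation of the kind described above; the quadratic functional, the weights Q, R_1, R_2 and the way the restrictions on the controls enter it are assumptions for illustration, not necessarily the exact setup of the paper:

\dot{x} = A(x)\,x + B_1(x)\,u + B_2(x)\,v, \qquad x(0) = x_0,

J(u, v) = \int_0^{\infty} \big( x^{\top} Q\, x + u^{\top} R_1 u - v^{\top} R_2 v \big)\, dt, \qquad \min_{u} \max_{v} J(u, v),

with the saddle-point feedbacks

u^{*}(x) = -R_1^{-1} B_1^{\top}(x) P(x)\, x, \qquad v^{*}(x) = R_2^{-1} B_2^{\top}(x) P(x)\, x,

where P(x) = P^{\top}(x) \succ 0 is the stabilizing solution of the Riccati equation with state-dependent parameters

A^{\top}(x) P(x) + P(x) A(x) - P(x) \big( B_1(x) R_1^{-1} B_1^{\top}(x) - B_2(x) R_2^{-1} B_2^{\top}(x) \big) P(x) + Q = 0.

In this reading, the "ratio of constraints" on the controls corresponds to choosing R_1 and R_2 so that B_1 R_1^{-1} B_1^{\top} - B_2 R_2^{-1} B_2^{\top} \succeq 0, which is a simple sufficient condition for such a P(x) to exist pointwise.

A hypothetical Python sketch of the pointwise SDRE scheme and of a two-player simulation on a long horizon follows; the Duffing-type plant, the weights and the eigenvector-based Riccati solver are illustrative assumptions, not the example reported in the paper.

import numpy as np
from scipy.integrate import solve_ivp

def care(A, S, Q):
    # Stabilizing solution P of  A' P + P A - P S P + Q = 0,
    # taken from the stable invariant subspace of the Hamiltonian matrix.
    n = A.shape[0]
    H = np.block([[A, -S], [-Q, -A.T]])
    w, V = np.linalg.eig(H)
    U = V[:, w.real < 0]                       # n eigenvectors with Re(lambda) < 0
    X1, X2 = U[:n, :], U[n:, :]
    return np.real(X2 @ np.linalg.inv(X1))

# Hypothetical SDC model (illustrative only):  x' = A(x) x + B1 u + B2 v,
# a Duffing-type oscillator driven by the minimizing player u
# and by the maximizing player v.
B1 = np.array([[0.0], [1.0]])
B2 = np.array([[0.0], [1.0]])
Q  = np.eye(2)
R1 = np.array([[1.0]])                         # weight on u
R2 = np.array([[4.0]])                         # weight on v, taken large enough

def A_of_x(x):
    return np.array([[0.0, 1.0],
                     [-1.0 - x[0] ** 2, -1.0]])

# With these weights S >= 0, so a stabilizing P(x) exists pointwise.
S = B1 @ np.linalg.inv(R1) @ B1.T - B2 @ np.linalg.inv(R2) @ B2.T

def rhs(t, x):
    A = A_of_x(x)
    P = care(A, S, Q)                          # SDRE solved pointwise in x
    u = -np.linalg.inv(R1) @ B1.T @ P @ x      # minimizer's feedback
    v =  np.linalg.inv(R2) @ B2.T @ P @ x      # maximizer's feedback
    return A @ x + (B1 @ u + B2 @ v).ravel()

sol = solve_ivp(rhs, (0.0, 20.0), np.array([2.0, -1.0]), max_step=0.01)
print("final state:", sol.y[:, -1])            # expected to approach the origin
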

Keywords: 

extended linearization method, non-classical functional, Bellman-Isaacs equation, Riccati equation with state-dependent parameters.

DOI: 10.14357/20790279200307

PP. 56-64.
 