Ladjal Brahim. Synthesis of Control of a Nonlinear Dynamic Object with Limited Perturbations Using the SDC-Method
Abstract. 

The paper considers an optimal control problem posed as a differential game with constraints on the controls. The problem of optimal control under bounded perturbations is formulated for a class of dynamical systems whose nonlinear objects can be represented as objects with a linear structure and state-dependent coefficients. The introduced quadratic functional makes it possible to treat the problem with the methods of zero-sum differential games. The linear structure of the transformed nonlinear system and the quadratic performance functional allow the synthesis of optimal controls, i.e. of the regulator parameters, to pass from the search for solutions of the Bellman-Isaacs equation to a Riccati-type equation with state-dependent parameters. The synthesized controls endow the SDC-model with asymptotic stability and make it possible to determine the ratio of the constraints imposed on the controls under which the existence of a zero-sum differential game is ensured. The main difficulty in implementing the optimal control is finding a solution of the Riccati equation with state-dependent parameters at the rate at which the object operates. To solve this equation, a suboptimal control method with quasi-stationary parameter values is proposed, and an estimate of the discrepancy between the optimal and suboptimal solutions is given. An example is provided to illustrate the effectiveness of the FRE technique for the design of feedback controllers for nonlinear systems.
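The following sketch (not taken from the paper) illustrates the suboptimal scheme described above, in which the Riccati equation with state-dependent parameters is re-solved with quasi-stationary ("frozen") values of the coefficients at the current state. The SDC factorization A(x), the channel matrices B1 and B2, the weights Q, R1, R2, and the helper names solve_game_riccati and feedback are all hypothetical choices made only to show the computational pattern, assuming a game-type Riccati equation of the form A(x)'P + P A(x) - P (B1 R1^{-1} B1' - B2 R2^{-1} B2') P + Q = 0 with feedback u = -R1^{-1} B1' P(x) x.

# Illustrative sketch only (not from the paper): pointwise solution of the
# game-type Riccati equation with state-dependent coefficients. All matrices,
# weights, and helper names below are hypothetical.

import numpy as np
from scipy.linalg import schur

def solve_game_riccati(A, B1, B2, Q, R1, R2):
    # Solve A'P + P A - P (B1 R1^{-1} B1' - B2 R2^{-1} B2') P + Q = 0
    # via the stable invariant subspace of the Hamiltonian matrix.
    S = B1 @ np.linalg.inv(R1) @ B1.T - B2 @ np.linalg.inv(R2) @ B2.T
    n = A.shape[0]
    H = np.block([[A, -S], [-Q, -A.T]])
    _, Z, _ = schur(H, output='real', sort='lhp')   # stable eigenvalues ordered first
    X1, X2 = Z[:n, :n], Z[n:, :n]
    return X2 @ np.linalg.inv(X1)                   # stabilizing solution P = X2 X1^{-1}

# Hypothetical SDC factorization x' = A(x) x + B1 u + B2 w of a nonlinear object.
def A_of_x(x):
    return np.array([[0.0, 1.0],
                     [-1.0 - x[0]**2, -0.5]])

B1 = np.array([[0.0], [1.0]])      # control channel
B2 = np.array([[0.0], [0.5]])      # bounded-perturbation channel
Q  = np.eye(2)
R1 = np.array([[1.0]])
R2 = np.array([[4.0]])             # chosen large enough that the game has a value

def feedback(x):
    # Quasi-stationary (frozen) scheme: re-solve the Riccati equation at the
    # current state and apply u = -R1^{-1} B1' P(x) x.
    P = solve_game_riccati(A_of_x(x), B1, B2, Q, R1, R2)
    return -np.linalg.inv(R1) @ B1.T @ P @ x

# Simple Euler simulation of the closed loop (worst-case perturbation omitted).
x, dt = np.array([1.0, 0.0]), 0.01
for _ in range(1000):
    x = x + dt * (A_of_x(x) @ x + B1 @ feedback(x))
print("final state:", x)

Whether the zero-sum game has a value depends on the ratio of the weights R1 and R2 (equivalently, on the constraints imposed on the two controls); in this sketch R2 is taken large enough that the quadratic term B1 R1^{-1} B1' - B2 R2^{-1} B2' remains positive semidefinite, so the pointwise Riccati solution exists for every state.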

Keywords: 

extended linearization method, Bellman-Isaacs equation, Riccati equation with state-dependent parameters.

PP. 90-99.

DOI: 10.14357/20790279230209
 
References

1. Isaacs R. Differential Games. New York: John Wiley and Sons, 1965. [Russian translation: Moscow: Mir, 1967. 480 p.]
2. Pontryagin L.S. On linear differential games. 1 // Doklady Akademii Nauk SSSR. 1967. Vol. 174. No. 6. Pp. 1278-1280.
3. Pontryagin L.S. On linear differential games. 2 // Doklady Akademii Nauk SSSR. 1967. Vol. 175. No. 4. Pp. 764-766.
4. Mishchenko E.F. On some game problems of pursuit and evasion // Avtomatika i Telemekhanika (Automation and Remote Control). 1972. No. 9. Pp. 24-30.
5. Pshenichny B.N. Necessary Conditions for an Extremum. Moscow: Nauka, 1969. 150 p.
6. Krasovsky N.N., Subbotin A.I. Positional Differential Games. Moscow: Nauka, 1974. 455 p.
7. Bryson A., Ho Yu-Chi. Applied Theory of Optimal Control. Moscow: Mir, 1972. 544 p.
8. Bellman R., Angel E. Dynamic Programming and Partial Differential Equations. Moscow: Mir, 1974. 207 p.
9. Afanasyev V.N. Mathematical Theory of Control of Nonlinear Continuous Dynamic Systems. Moscow: KRASAND, 2020. 480 p.
10. Athans M., Falb P.L. Optimal Control. Moscow: Mashinostroenie, 1968. 764 p.
11. Cloutier J.R., D'Souza C.N., Mracek C.P. Nonlinear regulation and nonlinear H-infinity control via the state-dependent Riccati equation technique. Part 1: Theory // Proceedings of the First International Conference on Nonlinear Problems in Aviation and Aerospace, Daytona Beach, Florida, 1996.
12. Doyle J., Kuan Y., Primbs J., Freeman R., Murray R., Krstic M. Nonlinear Control: Comparisons and Case Studies // Notes from the nonlinear control workshop held at the American Control Conference, Albuquerque, New Mexico, 1998.
13. Fedorenko R.P. Approximate Solution of Optimal Control Problems. Moscow: Nauka, 1978. 487 p.
 
