
Optimal control

A term in control theory
Optimal control refers to seeking, under given constraints, a control that drives a given system performance index to its maximum (or minimum). It reflects the requirement for the ordered structure of a system to evolve toward a higher level. It belongs to the field of optimization and shares the same nature and theoretical basis as optimization. For a system with a given initial state, if the control is a function of time only and there is no feedback of the system state, the result is called open-loop optimal control; if the control signal is a function of the system state and of the system parameters or its environment, it is called adaptive control. [1]
Chinese name
Optimal control
Foreign name
optimal control
Core
modern control theory
Research problem
Finding an optimal control strategy under constraints
Category
optimization
Research methods
Dynamic programming method, minimum principle

Brief introduction

The problem of making a control system's performance index optimal can be summarized as follows: for a controlled dynamic system or motion process, find an optimal control scheme from a class of admissible controls so that the system's motion is transferred from an initial state to a specified target state while the value of its performance index is optimal. Such problems arise widely in engineering as well as in social and economic problems.
For example, determining an optimal control law so that a spacecraft consumes the least fuel in transferring from one orbit to another. Optimal control theory took shape and developed in the mid-1950s, driven by the demands of space technology. The dynamic programming method proposed by the American scholar R. Bellman in 1957 [2] and the maximum principle proposed by the Soviet scholar L. S. Pontryagin in 1958, separated by only about a year, both played an important role in the formation and development of optimal control theory. The optimal control problem for linear systems with a quadratic performance index was posed and solved by R. E. Kalman in the early 1960s.

Mathematical perspective

Mathematically, the problem of determining an optimal control can be stated as follows: under the constraints of the equation of motion and the admissible control range, find the extremum (maximum or minimum) of a performance index function (called a functional) whose variables are the control function and the motion state. The main methods for solving optimal control problems are the classical [3] calculus of variations (a mathematical method for finding extrema of functionals), the maximum principle, and dynamic programming. Optimal control has been applied to the synthesis and design of time-optimal (maximum-speed) control systems, minimum-fuel control systems, minimum-energy control systems, linear regulators, and so on.
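The standard continuous-time formulation of this problem can be sketched as follows (the symbols x, u, f, L, Φ, U and S below are generic notation introduced here for illustration, not taken from the cited references):

\[
\min_{u(\cdot)\,\in\, U}\; J = \Phi\bigl(x(t_f)\bigr) + \int_{t_0}^{t_f} L\bigl(x(t),u(t),t\bigr)\,dt
\]

subject to the equation of motion and the boundary conditions

\[
\dot{x}(t) = f\bigl(x(t),u(t),t\bigr), \qquad x(t_0) = x_0, \qquad x(t_f) \in S,
\]

where x is the state, u is the control restricted to the admissible set U, J is the performance index (a functional of the control function and the state trajectory), and S is the specified target set.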
Variational theory is a powerful mathematical tool for studying optimal control problems, but classical variational theory can only handle problems in which the control is unconstrained. Since most problems in engineering practice involve constrained controls, modern variational theory was developed.
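To make the distinction concrete (a sketch in the Hamiltonian notation standard in this subject, not drawn from the cited references): defining the Hamiltonian

\[
H(x,u,\lambda,t) = L(x,u,t) + \lambda^{\mathsf T} f(x,u,t),
\]

the classical calculus of variations imposes the stationarity condition \(\partial H / \partial u = 0\), which presumes the control can be varied freely. The minimum principle replaces this with pointwise minimization over the admissible set,

\[
u^*(t) = \arg\min_{u \,\in\, U} H\bigl(x^*(t), u, \lambda^*(t), t\bigr),
\]

which remains valid when U is a closed set, for example when the control amplitude is bounded.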

Research methods

Announce
edit
Two methods are most commonly used in modern variational theory: one is dynamic programming, and the other is the minimum principle. Both can solve variational problems with closed-set constraints on the control. It is worth pointing out that the dynamic programming method and the minimum principle are essentially analytical methods; the calculus of variations and the linear-quadratic control method are also analytical methods for solving optimal control problems. Besides analytical methods, the research methods for optimal control problems include numerical methods and gradient-type methods.
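As a minimal illustration of how dynamic programming applies to the linear-quadratic case mentioned above, the following Python sketch solves a discrete-time, finite-horizon linear-quadratic regulator by backward Riccati recursion. The system matrices A, B and the weights Q, R, Qf are illustrative assumptions chosen here, not values taken from the references.

    import numpy as np

    def finite_horizon_lqr(A, B, Q, R, Qf, N):
        """Backward Riccati recursion; returns feedback gains K_0 .. K_{N-1}."""
        P = Qf                      # cost-to-go matrix at the terminal step
        gains = []
        for _ in range(N):
            # Minimizing u'Ru + (Ax+Bu)'P(Ax+Bu) over u gives u = -K x
            K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
            P = Q + A.T @ P @ (A - B @ K)   # cost-to-go one step earlier
            gains.append(K)
        return gains[::-1]          # reorder so gains[k] applies at step k

    # Illustrative example: a discretized double integrator (0.1 s sampling)
    A = np.array([[1.0, 0.1], [0.0, 1.0]])
    B = np.array([[0.005], [0.1]])
    Q, R, Qf = np.eye(2), np.array([[0.01]]), 10 * np.eye(2)

    K = finite_horizon_lqr(A, B, Q, R, Qf, N=50)
    x = np.array([[1.0], [0.0]])            # initial state
    for k in range(50):
        u = -K[k] @ x                       # optimal state feedback at step k
        x = A @ x + B @ u                   # closed-loop state update
    print("final state:", x.ravel())

The backward recursion embodies the dynamic-programming principle: the cost-to-go matrix P is propagated from the terminal time toward the initial time, and the optimal control at each step is a linear state feedback u = -K_k x.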

Relevant reference books

【1】 Sun Wenyu, Xu Chengxian, Zhu Detong. Optimization Methods. Higher Education Press, 2004.
【2】 Wang Xiaowu. Foundations of Modern Control Theory, 2nd ed. China Machine Press, 2006.
【3】 Hu Shousong. Principles of Automatic Control, 5th ed. Science Press, 2007.
【4】 Liu Bao, Tang Wansheng. Modern Control Theory, 3rd ed. China Machine Press, 2006.