Optimal Control System

an automatic control system that ensures the best, or optimal, functioning of the controlled object from a particular point of view. The characteristics of the object, as well as the external disturbances, may change in an unforeseen manner but usually remain within certain limits.

The optimal functioning of a control system is described by the criterion of optimal control, also called the criterion of optimality or the target function, which is a quantity that defines the efficiency of achieving the goal of control and depends on the change in time or space of the coordinates and parameters of the system. Various technical and economic indexes of the functioning of the object may be the criterion of optimality; among them are efficiency, speed of operation, average or maximum deviation of system parameters from assigned values, prime cost of the product, and certain indexes of product quality or a generalized quality index. The criterion of optimality may apply to a transient process, a stable process, or both.
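As a concrete illustration of such a criterion, the sketch below computes the integral of the squared deviation of a controlled coordinate from its assigned value. The first-order plant and the proportional control law are hypothetical choices made for this example only; they are not part of the article.

```python
def ise(k, r=1.0, dt=0.01, t_final=10.0):
    """Integral-of-squared-error criterion for the hypothetical plant
    dx/dt = -x + u under proportional control u = k*(r - x).

    The returned value is one possible criterion of optimality: the
    accumulated squared deviation of the coordinate x from the
    assigned value r over the interval [0, t_final]."""
    x, cost = 0.0, 0.0
    for _ in range(int(t_final / dt)):
        e = r - x
        cost += e * e * dt        # accumulate the squared deviation
        x += (-x + k * e) * dt    # forward-Euler step of the plant
    return cost
```

For this plant a larger gain tracks the assigned value more closely, so the criterion decreases with `k`; comparing `ise(1.0)` and `ise(5.0)` shows the criterion ranking one control rule above another.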

A distinction is made between regular and statistical criteria of optimality. Regular criteria depend on the regular parameters and on the coordinates of the controlled and controlling systems. Statistical criteria are used when the input signals are random functions or when random disturbances generated by certain elements of the system must be taken into account. In terms of its mathematical description, the criterion of optimality may be either a function of a finite number of parameters and coordinates of the controlled process, which assumes an extreme value when the system is functioning optimally, or a functional of the function that describes the control rule; in this case, the form of the function for which the functional assumes an extreme value is to be determined. Pontryagin's maximum principle or dynamic programming is used to design an optimal system.
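The dynamic-programming route can be sketched for the simplest case: a scalar linear plant with a quadratic criterion. The plant coefficients, weights, and horizon below are assumptions chosen for illustration; the backward Riccati recursion is the standard dynamic-programming solution of this linear-quadratic problem.

```python
def lqr_scalar(a, b, q, r, horizon):
    """Dynamic programming for the scalar plant x[k+1] = a*x[k] + b*u[k]
    with stage cost q*x**2 + r*u**2 (a hypothetical example).

    The value function is quadratic, p*x**2; sweeping backward from the
    terminal stage yields the optimal feedback u[k] = -K[k]*x[k]."""
    p = q          # terminal cost weight (assumed equal to q here)
    gains = []
    for _ in range(horizon):
        k = a * b * p / (r + b * b * p)   # minimizing gain at this stage
        p = q + a * a * p - a * b * p * k # Riccati update of the value fn
        gains.append(k)
    gains.reverse() # gains[0] now applies at the initial time
    return gains
```

For an unstable plant such as `a = 1.2`, the gains converge over a long horizon and the closed-loop coefficient `a - b*K` falls inside the unit interval, i.e. the optimal control also stabilizes the system.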

Optimal functioning of complex objects is achieved by using adaptive control systems, which are capable, while functioning, of automatically changing their control algorithms, characteristics, or structure so as to maintain the criterion of optimality under randomly changing parameters and operating conditions. In the general case, therefore, an optimal system consists of two parts: a constant (invariable) part, which includes the object of control and certain elements of the control system, and a variable part, which includes the remaining elements.
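A minimal sketch of the adaptive idea, under a deliberately simplified assumption: the plant is a static gain unknown to the controller. The "variable part" here is a single controller gain that is re-tuned online from a gradient estimate of the plant parameter; the plant model and update rule are hypothetical illustrations, not the article's method.

```python
def adaptive_gain_control(theta_true, steps=200, gamma=0.1):
    """Certainty-equivalence adaptive control of the hypothetical static
    plant y = theta * u, where theta is unknown to the controller.

    The controller estimates theta by a gradient update on the
    prediction error and inverts the estimate to track the reference,
    so the control law itself changes as the estimate adapts."""
    theta_hat = 0.5        # initial parameter estimate (assumption)
    r = 1.0                # constant reference to be tracked
    tracking_errors = []
    for _ in range(steps):
        u = r / theta_hat              # variable part: adapted control law
        y = theta_true * u             # plant response
        e = y - theta_hat * u          # prediction error of the model
        theta_hat += gamma * e * u     # gradient update of the estimate
        tracking_errors.append(abs(y - r))
    return theta_hat, tracking_errors
```

For `theta_true = 2.0` the estimate converges to the true gain and the tracking error decays toward zero, illustrating how the variable part compensates for an initially unknown characteristic of the constant part.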

