principle of optimality

(redirected from Bellman equation)

[′prin·sə·pəl əv ‚äp·tə′mal·əd·ē]
(control systems)
A principle which states that for optimal systems, any portion of the optimal state trajectory is optimal between the states it joins.
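The principle can be made concrete with a small sketch, not taken from the source: value iteration on a tiny, made-up deterministic Markov decision process. The optimal value function satisfies the Bellman equation V(s) = max_a [r(s, a) + γ·V(s′)], which encodes the principle of optimality: every tail of an optimal trajectory is itself optimal. The states, rewards, and discount factor below are illustrative assumptions.

```python
GAMMA = 0.9          # discount factor (assumed for illustration)
STATES = (0, 1, 2)   # hypothetical 3-state chain; state 2 is the goal


def step(s, a):
    """Deterministic transition: returns (next_state, reward)."""
    if a == "go":
        s_next = min(s + 1, 2)
        # +10 for entering the goal state, -1 per move otherwise
        return s_next, 10.0 if (s_next == 2 and s != 2) else -1.0
    return s, 0.0  # "stay" costs nothing


def value_iteration(tol=1e-9):
    """Iterate the Bellman operator to a fixed point V*."""
    v = {s: 0.0 for s in STATES}
    while True:
        new = {s: max(r + GAMMA * v[s2]
                      for s2, r in (step(s, a) for a in ("stay", "go")))
               for s in STATES}
        if max(abs(new[s] - v[s]) for s in STATES) < tol:
            return new
        v = new
```

Here `value_iteration()` converges to V = {0: 8.0, 1: 10.0, 2: 0.0}; the optimal value at state 0 is built from the already-optimal value at state 1, exactly as the principle asserts.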
References in periodicals archive:
m] → ℝ and satisfying the following Bellman equation,
We proceed by deriving the Bellman equation for J analogous to the Bellman equation for V we derived earlier.
Defining the value function V(m,n) in the obvious way, the Bellman equation is
The Bellman equation associated with the problem defined in Eq.
When the exponential decay, Poisson event and information cost are jointly considered, the Bellman equation becomes:
The somewhat unusual Bellman equation for the dual problem can be written
David Romer, an economist at the University of California at Berkeley, used it while looking at the first quarters of some 700 NFL games and has written a paper, "It's Fourth Down and What Does the Bellman Equation Say?"
This section presents the Taylor series expansion of the numerical solutions of the Bellman equation for the borrowed reserves function.
From this, the Bellman equation (equation (7)) is derived which determines the bank's lending and discount-borrowing behavior.
m] → ℝ and satisfying the Bellman equation (6):
We will refer to this condition as the Bellman equation for the value function V.
A Bellman equation instructs the decisionmaker to vary the policy instrument in order to generate information about unknown parameters and model probabilities.