Jensen's inequality

[′jen·sənz ‚in·i′kwäl·ədē]
(mathematics)
A general inequality satisfied by a convex function ƒ: ƒ(a1x1 + a2x2 + ⋯ + anxn) ≤ a1ƒ(x1) + a2ƒ(x2) + ⋯ + anƒ(xn), where the xi are any numbers in the region where ƒ is convex and the ai are nonnegative numbers whose sum is equal to 1.
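As a quick numerical sketch of the definition above (an illustration only, using f(x) = x² as the convex function and randomly generated points and weights):

```python
import random

def jensen_gap(f, xs, weights):
    """Return sum(a_i * f(x_i)) - f(sum(a_i * x_i)).

    For a convex f, Jensen's inequality says this gap is nonnegative
    whenever the weights are nonnegative and sum to 1.
    """
    mean = sum(a * x for a, x in zip(weights, xs))
    return sum(a * f(x) for a, x in zip(weights, xs)) - f(mean)

random.seed(0)
xs = [random.uniform(-5.0, 5.0) for _ in range(10)]
raw = [random.random() for _ in range(10)]
total = sum(raw)
weights = [r / total for r in raw]  # nonnegative weights summing to 1

# f(x) = x**2 is convex on all of R, so the gap must be >= 0.
gap = jensen_gap(lambda x: x * x, xs, weights)
print(gap >= 0)
```

With f(x) = x² the gap is exactly the weighted variance of the points, which makes its nonnegativity easy to see.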
If a1, a2, …, an are positive numbers and s > t > 0, then (a1^s + a2^s + ⋯ + an^s)^(1/s) ≤ (a1^t + a2^t + ⋯ + an^t)^(1/t).
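This power-sum inequality can be checked numerically as well (a minimal sketch; the sample values and exponents are arbitrary choices satisfying s > t > 0):

```python
def power_sum_norm(a, p):
    """Compute (a1**p + a2**p + ... + an**p)**(1/p) for positive a_i."""
    return sum(x ** p for x in a) ** (1.0 / p)

a = [0.3, 1.7, 2.4, 0.9]  # positive numbers
s, t = 3.0, 1.5           # s > t > 0

# The inequality asserts the s-power sum is no larger than the t-power sum.
print(power_sum_norm(a, s) <= power_sum_norm(a, t))
```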
References in periodicals archive
Recently, the Jensen inequality, its improvements, and their converses have been given for time-scale integrals (see [2-4]).
As discussed in [26], the reciprocally convex approach is effective in handling the double-integral terms of the LK functional for delayed systems; it can achieve performance identical to approaches based on the integral inequality lemma but with far fewer decision variables, comparable to those based on the Jensen inequality lemma.
From (11), (47), and (65), and using the dual Orlicz-Minkowski inequality, the Jensen inequality, and the Hölder inequality, we obtain
Recently, Seuret proposed a new inequality, called the Wirtinger-based integral inequality, in [32], which can provide a more accurate estimate than the Jensen inequality. Thus, if this inequality is employed to investigate singular neutral systems, an improved result can be derived.
for all affine combinations in A and that every convex function f: C → R satisfies the Jensen inequality
Furthermore, both the discrete-time Jensen inequality and the lower bound lemma are adopted to handle the summation terms.
Recently, a new multiple-integral inequality was introduced along the same lines as the proof of the Jensen inequality in [12], and a novel delay-dependent stability criterion was established, although its computational burden has been observed to be somewhat heavy.
The following inequality is the integral analogue of another companion inequality to the Jensen inequality.
It is clear that this new inequality encompasses the Jensen inequality. It is also worth noting that it plays an important role in getting the derivative of the LKF.
In order to estimate such upper bounds on the derivative of the Lyapunov-Krasovskii functional, various mathematical tools have been used, such as the Jensen inequality [3, 4, 7], the lower bound lemma for reciprocal convexity [8-11], the delay partitioning method [11], and the free-weighting matrix variables method [6, 12, 13].
For the implication a) ⇒ c), we apply the Jensen inequality to g and obtain