Taylor's theorem

(redirected from Lagrange error bound)

Taylor's theorem

[′tā·lərz ‚thir·əm]
(mathematics)
The theorem that, under certain conditions, a real or complex function can be represented, in a neighborhood of a point where it is infinitely differentiable, as a power series whose coefficients involve its derivatives of various orders evaluated at that point.
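A minimal sketch of the idea: the partial sums of the Taylor series, built from the derivatives at the expansion point, approximate the function near that point. Here e^x about 0 is used purely as an illustration (its k-th derivative at 0 is 1, so the coefficients are 1/k!).

```python
import math

def taylor_exp(x, n):
    """Degree-n Taylor polynomial of e^x about 0: sum of x^k / k!."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

# Near the expansion point, low degrees already do well;
# at x = 1 the degree-10 polynomial is accurate to ~7 decimal places.
approx = taylor_exp(1.0, 10)
print(abs(approx - math.e))
```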
References in periodicals archive:
We calculate the following Lagrange error bounds for different values of n, respectively: M5 = 0.161 × 10⁻², which gives 2 decimal places of accuracy; M8 = 0.352 × 10⁻⁵, which gives 5 decimal places of accuracy; and M10 = 0.248 × 10⁻⁷, which gives 7 decimal places of accuracy.
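A hedged sketch of how such bounds are computed. The Lagrange error bound states that the remainder of the degree-n Taylor polynomial satisfies |R_n(x)| ≤ M·|x − a|^(n+1)/(n+1)!, where M bounds the (n+1)-th derivative between a and x. The excerpt does not specify the function or interval, so e^x on [0, 1] (where M = e works) is assumed here for illustration only; the printed bounds are not the excerpt's values.

```python
import math

def lagrange_bound(M, x, a, n):
    """Upper bound on the Taylor remainder: M * |x - a|^(n+1) / (n+1)!."""
    return M * abs(x - a) ** (n + 1) / math.factorial(n + 1)

# Illustrative choice: f(x) = e^x about a = 0, evaluated at x = 1,
# with |f^(n+1)| <= e on [0, 1].
for n in (5, 8, 10):
    print(n, lagrange_bound(math.e, 1.0, 0.0, n))
```

As n grows, the factorial in the denominator drives the bound down rapidly, which is why a few extra terms buy several extra decimal places of accuracy.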