A Generalised Trapezoid Type Inequality for Convex Functions
Sever Silvestru Dragomir
TL;DR
The paper develops generalisations of the trapezoid inequality for convex functions. Using the identity $(x-a)f(a)+(b-x)f(b)-\int_a^b f(t)\,dt = \int_a^b (t-x)f'(t)\,dt$, it derives the sharp lower bound $\frac{1}{2}[(b-x)^2 f'_+(x) - (x-a)^2 f'_-(x)] \le (x-a)f(a)+(b-x)f(b) - \int_a^b f(t)\,dt$ together with a corresponding upper bound, and obtains Hermite-Hadamard type corollaries, including a sharp bound on $\frac{f(a)+f(b)}{2} - \frac{1}{b-a}\int_a^b f(t)\,dt$. These single-interval results are then extended to the composite case, with explicit bounds for the remainder $S_n$ in $\int_a^b f = G_n - S_n$ and for the remainder $Q_n$ of the trapezoid quadrature rule, with sharp constants $1/2$ and $1/8$ respectively. The inequalities are applied to probability density functions, bounding the expectation $E(X)$ for increasing densities and yielding a midpoint-type bound as well. Finally, analogous bounds are derived for the Hermite-Hadamard ($HH$) divergence in information theory, relating the Csiszár $f$-divergence to the $HH$-divergence via explicit derivative-based bounds, thereby connecting convexity-based quadrature error analysis to divergence measures. Overall, the work provides rigorous, convexity-driven bounds, with sharp constants, for quadrature errors, expectations of random variables, and information-theoretic divergences.
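The sharp lower bound above can be sanity-checked numerically. The sketch below, which is illustrative and not from the paper, takes the smooth convex function $f = \exp$ on $[0,1]$ with $x = 0.3$ (arbitrary choices), so that $f'_+ = f'_- = f'$ and the one-sided derivatives collapse to $f'(x) = e^x$:

```python
import math

# Check the sharp lower bound from the summary,
#   (1/2)[(b-x)^2 f'_+(x) - (x-a)^2 f'_-(x)]
#       <= (x-a) f(a) + (b-x) f(b) - int_a^b f(t) dt,
# for a differentiable convex f, where f'_+(x) = f'_-(x) = f'(x).
# f = exp on [0, 1] with x = 0.3 is an illustrative choice, not from the paper.

f = math.exp                 # convex on R, and f' = exp as well
a, b, x = 0.0, 1.0, 0.3

integral = math.e - 1.0      # exact value of int_0^1 e^t dt
trapezoid_gap = (x - a) * f(a) + (b - x) * f(b) - integral
lower_bound = 0.5 * ((b - x) ** 2 - (x - a) ** 2) * f(x)  # f'(x) = e^x

assert lower_bound <= trapezoid_gap
print(f"{lower_bound:.5f} <= {trapezoid_gap:.5f}")  # prints 0.26997 <= 0.48452
```

The inequality is nontrivial here: the lower bound $0.2\,e^{0.3} \approx 0.270$ sits well below the generalised trapezoid gap $1.3 - 0.3e \approx 0.485$, and sharpness means no constant larger than $1/2$ works for all convex $f$.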
Abstract
A generalised trapezoid inequality for convex functions and applications to quadrature rules are given. A refinement and a counterpart result for the Hermite-Hadamard inequalities are obtained, and some inequalities for probability density functions and the (HH)-divergence measure are also mentioned.
