
A Generalised Trapezoid Type Inequality for Convex Functions

Sever Silvestru Dragomir

TL;DR

The paper develops convex-function-based generalisations of the trapezoid inequality, deriving the sharp lower bound $\frac{1}{2}[(b-x)^2 f'_+(x) - (x-a)^2 f'_-(x)] \le (x-a)f(a)+(b-x)f(b) - \int_a^b f(t)\, dt$ and a corresponding upper bound via the identity $(x-a)f(a)+(b-x)f(b) - \int_a^b f(t)\, dt = \int_a^b (t-x) f'(t)\, dt$, together with Hermite-Hadamard-type corollaries that include a sharp bound on $\frac{f(a)+f(b)}{2} - \frac{1}{b-a}\int_a^b f(t)\, dt$. It then extends these single-interval results to the composite case, providing explicit bounds for the remainder $S_n$ in $\int_a^b f = G_n - S_n$ and for the trapezoid-based quadrature remainder $Q_n$, with sharp constants $1/2$ and $1/8$ respectively. The paper further applies these inequalities to probability density functions, bounding $E(X)$ for increasing densities and yielding a mid-point-type bound as well. Finally, it derives analogous bounds for the Hermite-Hadamard divergence in information theory, relating the Csiszár $f$-divergence to the $HH$-divergence with explicit derivative-based bounds, thereby connecting convexity-based quadrature error analysis to divergence measures. Overall, the work provides rigorous, convexity-driven bounds for quadrature errors, pdf expectations, and information-theoretic divergences with sharp constants.
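The lower bound quoted above can be checked numerically. The sketch below (an illustrative choice, not from the paper) uses the convex function $f(t)=t^2$ on $[a,b]=[0,1]$ with $x=3/10$; since $f'_-(t)=f'_+(t)=2t$ and the integral is exact, rational arithmetic verifies the inequality exactly.

```python
from fractions import Fraction

# Illustrative data: f(t) = t^2 is convex on [0, 1], with a two-sided
# derivative f'(t) = 2t and exact integral 1/3. The point x is arbitrary.
a, b, x = Fraction(0), Fraction(1), Fraction(3, 10)

f = lambda t: t * t
fp = lambda t: 2 * t                 # f'_-(t) = f'_+(t) = 2t for t^2
integral = Fraction(1, 3)            # exact value of ∫_0^1 t^2 dt

# Generalised trapezoid difference: (x-a)f(a) + (b-x)f(b) - ∫_a^b f(t) dt
T = (x - a) * f(a) + (b - x) * f(b) - integral

# Sharp lower bound from the paper:
# (1/2) [ (b-x)^2 f'_+(x) - (x-a)^2 f'_-(x) ]
lower = Fraction(1, 2) * ((b - x) ** 2 * fp(x) - (x - a) ** 2 * fp(x))

assert lower <= T
print(lower, T)   # lower = 3/25, T = 11/30, and 3/25 <= 11/30 holds
```

Exact rationals avoid floating-point noise, so the inequality is verified symbolically rather than up to rounding error.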

Abstract

A generalised trapezoid inequality for convex functions and applications for quadrature rules are given. A refinement and a counterpart result for the Hermite-Hadamard inequalities are obtained and some inequalities for pdf's and (HH)-divergence measure are also mentioned.


Paper Structure

This paper contains 5 sections, 16 theorems, and 57 equations.

Key Result

Theorem 1

Let $f:\left[ a,b\right] \rightarrow \mathbb{R}$ be a function of bounded variation. Then the inequality
$$\left| \left( x-a\right) f\left( a\right) +\left( b-x\right) f\left( b\right) -\int_{a}^{b}f\left( t\right) dt\right| \leq \left[ \frac{1}{2}\left( b-a\right) +\left| x-\frac{a+b}{2}\right| \right] \bigvee_{a}^{b}\left( f\right)$$
holds for all $x\in \left[ a,b\right] ,$ where $\bigvee_{a}^{b}\left( f\right)$ denotes the total variation of $f$ on the interval $\left[ a,b\right]$. The constant $\frac{1}{2}$ is the best possible one.
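The abstract also announces a counterpart result for the Hermite-Hadamard inequalities. For a convex $f$ on $[a,b]$, the corollary with the sharp constant $\frac{1}{8}$ mentioned in the TL;DR takes the following form (reconstructed from the standard statement of this result, so the exact hypotheses in the paper may differ slightly):
$$0\leq \frac{f\left( a\right) +f\left( b\right) }{2}-\frac{1}{b-a}\int_{a}^{b}f\left( t\right) dt\leq \frac{1}{8}\left[ f_{-}^{\prime }\left( b\right) -f_{+}^{\prime }\left( a\right) \right] \left( b-a\right) .$$
The left inequality is the classical trapezoid half of Hermite-Hadamard; the right one bounds the quadrature error by the spread of the one-sided derivatives at the endpoints.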

Theorems & Definitions (23)

  • Theorem 1
  • Theorem 2
  • Theorem 3
  • Theorem 4
  • Theorem 5
  • Proof
  • Corollary 1
  • Corollary 2
  • Remark 1
  • Theorem 6
  • ...and 13 more