
An Ostrowski Type Inequality for Convex Functions

Sever Silvestru Dragomir

TL;DR

This work extends Ostrowski-type inequalities to convex functions, establishing a sharp lower bound for the deviation between the integral mean and pointwise evaluation, and deriving a Hermite–Hadamard refinement with a sharp 1/8 constant. It then develops a composite-case framework with explicit bounds on quadrature remainder terms, and extends the analysis to integral means, providing practical bounds for mean differences and special means. The paper further demonstrates applications to probability density functions and introduces Ostrowski-type bounds for HH-divergence, yielding quantitative estimates that refine comparisons between $f$-divergence and Hermite-Hadamard divergence. Collectively, these results sharpen quadrature error estimates, link convexity-based inequalities to probabilistic and informational measures, and offer tools for analyzing divergence in information theory.

Abstract

An Ostrowski type integral inequality for convex functions and applications for quadrature rules and integral means are given. A refinement and a counterpart result for Hermite-Hadamard inequalities are obtained and some inequalities for pdf's and (HH)-divergence measure are also mentioned.


Paper Structure

This paper contains 6 sections, 17 theorems, 68 equations.

Key Result

Theorem 1

Let $f:\left[ a,b\right] \rightarrow \mathbb{R}$ be a differentiable mapping on $\left( a,b\right)$ with the property that $\left| f^{\prime }\left( t\right) \right| \leq M$ for all $t\in \left( a,b\right)$. Then for all $x\in \left[ a,b\right]$,
$$\left| f\left( x\right) -\frac{1}{b-a}\int_{a}^{b}f\left( t\right) \,dt\right| \leq \left[ \frac{1}{4}+\frac{\left( x-\frac{a+b}{2}\right) ^{2}}{\left( b-a\right) ^{2}}\right] \left( b-a\right) M.$$
The constant $\frac{1}{4}$ is the best possible in the sense that it cannot be replaced by a smaller constant.
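As a sanity check (not part of the paper), the Ostrowski bound above can be verified numerically for a concrete convex function. The sketch below uses a midpoint Riemann sum to approximate the integral mean and compares the left- and right-hand sides for $f(t)=t^{2}$ on $[0,1]$, where $|f'(t)|\leq 2$; the function and parameter names are illustrative.

```python
def ostrowski_sides(f, M, a, b, x, n=100_000):
    """Return (lhs, rhs) of the Ostrowski inequality for f at x on [a, b].

    lhs = |f(x) - integral mean of f over [a, b]|
    rhs = [1/4 + (x - (a+b)/2)^2 / (b-a)^2] * (b-a) * M,
    where M bounds |f'| on (a, b). The integral is approximated
    by a midpoint Riemann sum with n subintervals.
    """
    h = (b - a) / n
    integral = sum(f(a + (i + 0.5) * h) for i in range(n)) * h
    mean = integral / (b - a)
    lhs = abs(f(x) - mean)
    rhs = (0.25 + (x - (a + b) / 2) ** 2 / (b - a) ** 2) * (b - a) * M
    return lhs, rhs

# f(t) = t^2 on [0, 1]: integral mean is 1/3, and |f'(t)| = 2t <= 2.
lhs, rhs = ostrowski_sides(lambda t: t * t, M=2.0, a=0.0, b=1.0, x=0.8)
assert lhs <= rhs  # |0.64 - 1/3| ≈ 0.307 <= 0.68
```

Evaluating the check at several points $x\in[a,b]$ (including the endpoints, where the bracket attains its maximum value $\frac12$) confirms the bound holds across the interval.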

Theorems & Definitions (27)

  • Theorem 1
  • Theorem 2
  • Theorem 3
  • Theorem 4
  • Theorem 6
  • Proof
  • Corollary 1
  • Corollary 2
  • Remark 1
  • Theorem 7
  • ...and 17 more