
On Sharpest Tail Bounds for Functions of Tail Bounded Random Variables

Stephen Jordan Harrison

Abstract

Consider $n$ real/complex, independent/dependent random variables with respective tail bounds, and let $g$ be a measurable function of the r.v.'s. Consider $f$, the "sharpest" tail bound of $g$ (sharpest in the sense that if $f$ were any less, then for some $X_1,...,X_n$ satisfying the conditions, $g(X_1,...,X_n)$ would not satisfy $f$). Significant research has been done to approximate $f$, often with high accuracy. These results are typically of the form that for $g$ in this family and tail bounds of $X_k$ in this family, $f$ is bounded by some $f'$ with high accuracy. However, the question "what would it take to find $f$ exactly?" has received little attention, apparently even for simple cases. This is the question we try to answer. For $X_1,...,X_n$ required to be mutually independent, the $X_k$ are first simplified, WLOG, to be monotone on $(0,1)$. This strengthens convergence in distribution to convergence a.e. (Skorokhod's representation theorem) and allows defining shift operators, which help reduce the space of r.v.'s one searches to find $f$ and/or the maximum measure of a subset. We do find $f$ in some special cases; however, $f$ rarely has a closed form. For $X_1,...,X_n$ dependent/not necessarily independent, another reduction in the space of r.v.'s one searches to find $f$ is done.
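The monotone simplification in the abstract can be illustrated concretely: replacing a random variable $X$ with CDF $F$ by its quantile function $\widetilde{X}(u)=F^{-1}(u)$ on $(0,1)$ (with Lebesgue measure) yields a nondecreasing r.v. with the same distribution, hence the same tail bounds. A minimal numerical sketch, using the exponential distribution as an illustrative stand-in (the function names here are not from the paper):

```python
import math
import random

def quantile_exponential(u, rate=1.0):
    """Inverse CDF of Exp(rate); nondecreasing on (0,1)."""
    return -math.log(1.0 - u) / rate

# The monotone representative X~(u) = F^{-1}(u) on (0,1) has the same
# distribution as X: sampling u ~ Uniform(0,1) and applying F^{-1}
# reproduces the tail P(X > t) = exp(-rate * t).
random.seed(0)
samples = [quantile_exponential(random.random()) for _ in range(100_000)]

t = 1.0
empirical_tail = sum(s > t for s in samples) / len(samples)
print(abs(empirical_tail - math.exp(-t)) < 0.01)  # tail bound is preserved
```

Because $\widetilde{X}$ is monotone, convergence of quantile functions pointwise a.e. follows from convergence in distribution, which is the mechanism Skorokhod's representation theorem exploits.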


Paper Structure

This paper contains 23 sections, 49 theorems, 388 equations, 4 figures.

Key Result

Theorem 1.1.1

Let $\mathcal{F}_j$ be a $\sigma$-algebra on $\Omega_j$ with $\sigma$-finite measure $\mu_j$ for $j=1,...,n$. Then there is a unique measure $\mu$ on the product $\sigma$-algebra $\mathcal{F}$ of $\Omega=\Omega_1\times \cdots \times \Omega_n$ such that for any $A_j \in \mathcal{F}_j$, $\mu(A_1\times\cdots\times A_n)=\prod_{j=1}^n \mu_j(A_j)$, where for $f \in \mathcal{L}^1(\Omega,\mathcal{F},\mu)$, the iterated integral is defined and equals $\int_\Omega f \, d\mu$ regardless of the order of integration.
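The theorem (the standard product-measure/Fubini statement) can be sanity-checked numerically: for an integrable function on a product space, the iterated integrals agree in either order and match the integral against the product measure. A toy midpoint-rule sketch on $[0,1]^2$ with Lebesgue measure (the function chosen here is illustrative, not from the paper):

```python
import math

def midpoint_grid(n):
    """Midpoints of n equal subintervals of [0, 1]."""
    return [(k + 0.5) / n for k in range(n)]

def f(x, y):
    # A simple integrable function; exact integral over [0,1]^2 is
    # 1/4 + 2*sin(1) - sin(2).
    return x * y + math.sin(x + y)

n = 200
pts = midpoint_grid(n)
w = 1.0 / n  # weight of each subinterval

# Iterated integrals in both orders (Fubini says these must agree).
iterated_xy = sum(sum(f(x, y) * w for y in pts) * w for x in pts)
iterated_yx = sum(sum(f(x, y) * w for x in pts) * w for y in pts)

print(abs(iterated_xy - iterated_yx) < 1e-9)
print(abs(iterated_xy - (0.25 + 2 * math.sin(1) - math.sin(2))) < 1e-3)
```

The $\sigma$-finiteness hypothesis is what guarantees uniqueness of the product measure; without it, distinct measures can agree on all measurable rectangles.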

Figures (4)

  • Figure 1: Example $1$
  • Figure 2: Informal argument for shift operators
  • Figure 3: Visualizing the sharpest tail bound
  • Figure 4: Distribution of $\mathscr{S}_2((f_k^-,f_k^+),G^k,c_k)$

Theorems & Definitions (102)

  • Theorem 1.1.1
  • Theorem 1.2.1: Hoeffding's inequality
  • Theorem 1.2.2: Hoeffding's inequality, two-sided
  • Theorem 1.2.3: General Hoeffding inequality
  • Definition 1
  • Lemma 1: uniqueness of neat r.v.'s
  • Proof
  • Lemma 2: $\widetilde{X}$ lemma
  • Proof
  • Lemma 3: existence of neat r.v.'s
  • ...and 92 more