Empirical tail dependence functions in high dimensions: uniform linearizations and inference

Axel Bücher, Yeonjoon Choi, Katharina Effertz, Stanislav Volgushev

Abstract

The analysis of extremal dependence in high dimensions has recently attracted considerable interest. Existing methodology primarily focuses on modeling and estimation of extremal dependence structures, often supported by concentration bounds for empirical tail quantities. However, comparatively little is known about general inferential procedures in high-dimensional extremes. In this paper, we develop foundational theory enabling inference for methods based on empirical tail dependence coefficients and stable tail dependence functions. These estimators are constructed from ranks, which complicates distributional approximations since the stochastic fluctuations of the ranks interfere with those arising from the unknown tail dependence. We establish uniform linearization results for empirical stable tail dependence functions in the form of finite-sample probability bounds that quantify the error of the rank linearization uniformly over collections of coordinates. Within an asymptotic framework, these bounds allow the dimension to grow exponentially with the effective sample size while preserving the validity of the linear approximation. Moreover, we derive high-dimensional central limit theorems and establish the validity of multiplier bootstrap procedures for collections of empirical tail dependence statistics. We illustrate the usefulness of the results through two applications: uniform expansions for M-estimators of tail dependence parameters and inference for spatial isotropy based on collections of tail dependence functions.
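The rank-based estimator discussed in the abstract can be illustrated with a standard (Huang-type) empirical stable tail dependence function; the paper's exact rank-threshold convention may differ, and the names `empirical_stdf`, `k`, and `x` below are illustrative choices, not the authors' notation.

```python
import numpy as np

def empirical_stdf(X, k, x):
    """Rank-based empirical stable tail dependence function (a standard sketch).

    X : (n, d) array of observations
    k : intermediate sequence (number of tail observations), the effective sample size
    x : (d,) evaluation point with nonnegative coordinates
    """
    n, d = X.shape
    # R[i, j] = rank of X[i, j] within column j (1 = smallest, n = largest)
    R = np.argsort(np.argsort(X, axis=0), axis=0) + 1
    # indicator that at least one coordinate is among its top ~ k * x_j ranks
    extreme = (R > n - k * np.asarray(x)).any(axis=1)
    return extreme.sum() / k

# usage on simulated data with a common latent factor (hence tail dependence)
rng = np.random.default_rng(0)
Z = rng.standard_normal((2000, 1))
X = np.hstack([Z + 0.5 * rng.standard_normal((2000, 1)) for _ in range(2)])
L_hat = empirical_stdf(X, k=100, x=(1.0, 1.0))
lam_hat = 2.0 - L_hat  # empirical upper tail dependence coefficient
```

For d = 2 and x = (1, 1) the estimate always lies in [1, 2], with L = 1 for complete tail dependence and L = 2 for tail independence, so `lam_hat` lies in [0, 1].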

Paper Structure

This paper contains 12 sections, 28 theorems, 340 equations, 2 tables.

Key Result

Theorem 2.1

Suppose that the following conditions are met: [...] Then, for any fixed $T \in \mathbb{N}$, we have [...] where [...]. Here, $\widetilde{\mathbb L}_{nj}(x_j)=\widetilde{\mathbb{L}}_n(0, \dots, 0, x_j, 0, \dots, 0)$, and $\partial_{j} L(\bm x)$ is defined as the right-hand derivative at points $\bm x$ with $x_j=0$. Moreover, we have $\widetilde{\mathbb L}_n = \sqrt{k}\,(\widetilde{L}_n - \widetilde{\mu}_n)$, which converges weakly [...]
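The multiplier bootstrap mentioned in the abstract can be sketched in its simplest form: multiply the centered tail indicators by i.i.d. standard normal weights and recompute the statistic. This is a deliberately simplified illustration; the paper's validated procedure accounts for the rank transformation via the partial-derivative terms appearing in Theorem 2.1, which are omitted here, and all names below (`multiplier_bootstrap_stdf`, `B`) are illustrative.

```python
import numpy as np

def multiplier_bootstrap_stdf(X, k, x, B=200, rng=None):
    """Naive multiplier bootstrap for sqrt(k) * (L_hat - L) at a single point x.

    Returns the point estimate L_hat and B bootstrap replicates of the
    (uncorrected) limiting fluctuation. The derivative correction from the
    rank linearization is intentionally left out in this sketch.
    """
    if rng is None:
        rng = np.random.default_rng()
    n, d = X.shape
    R = np.argsort(np.argsort(X, axis=0), axis=0) + 1  # columnwise ranks in 1..n
    ind = (R > n - k * np.asarray(x)).any(axis=1).astype(float)  # tail indicators
    L_hat = ind.sum() / k
    xi = rng.standard_normal((B, n))  # mean-0, variance-1 multipliers
    boot = (xi * (ind - ind.mean())).sum(axis=1) / np.sqrt(k)
    return L_hat, boot

# usage: approximate the sampling fluctuation of L_hat at x = (1, 1)
rng = np.random.default_rng(1)
X = rng.standard_normal((1500, 2))
L_hat, boot = multiplier_bootstrap_stdf(X, k=80, x=(1.0, 1.0), B=200, rng=rng)
se_L_hat = boot.std() / np.sqrt(80)  # implied standard error of L_hat itself
```

Since `boot` mimics the distribution of $\sqrt{k}(\widehat L_n - L)$, dividing its standard deviation by $\sqrt{k}$ gives a rough standard error for the point estimate.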

Theorems & Definitions (60)

  • Theorem 2.1: Linearization and weak convergence for fixed $d$
  • Theorem 3.1
  • Corollary 3.2
  • Proof
  • Theorem 3.3
  • Corollary 3.4
  • Proof
  • Remark 3.5: Comparison of conditions \ref{cond:smoothness-hoelder} and \ref{cond:smoothness-good}
  • Remark 3.6: On the bias term
  • Lemma 3.7
  • ...and 50 more