
Lévy-Flow Models: Heavy-Tail-Aware Normalizing Flows for Financial Risk Management

Rachid Drissi

Abstract

We introduce Lévy-Flows, a class of normalizing flow models that replace the standard Gaussian base distribution with Lévy process-based distributions, specifically Variance Gamma (VG) and Normal-Inverse Gaussian (NIG). These distributions naturally capture heavy-tailed behavior while preserving exact likelihood evaluation and efficient reparameterized sampling. We establish theoretical guarantees on tail behavior, showing that for regularly varying bases the tail index is preserved under asymptotically linear flow transformations, and that identity-tail Neural Spline Flow architectures preserve the base distribution's tail shape exactly outside the transformation region. Empirically, we evaluate on S&P 500 daily returns and additional assets, demonstrating substantial improvements in density estimation and risk calibration. VG-based flows reduce test negative log-likelihood by 69% relative to Gaussian flows and achieve exact 95% VaR calibration, while NIG-based flows provide the most accurate Expected Shortfall estimates. These results show that incorporating Lévy process structure into normalizing flows yields significant gains in modeling heavy-tailed data, with applications to financial risk management.
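The paper's implementation is not reproduced here, but the reparameterized sampling the abstract refers to can be illustrated for the Variance Gamma case, which admits a normal variance-mean mixture representation with a Gamma subordinator. The function name and parameterization below are illustrative assumptions, not the paper's code:

```python
import numpy as np

def sample_vg(n, mu=0.0, theta=0.0, sigma=1.0, nu=0.5, rng=None):
    """Draw n Variance Gamma samples via the normal variance-mean
    mixture: X = mu + theta*G + sigma*sqrt(G)*N, where
    G ~ Gamma(shape=1/nu, scale=nu) is the Gamma subordinator."""
    rng = np.random.default_rng(rng)
    g = rng.gamma(shape=1.0 / nu, scale=nu, size=n)  # subordinator increment
    z = rng.standard_normal(n)                       # reparameterized noise
    return mu + theta * g + sigma * np.sqrt(g) * z

x = sample_vg(100_000, nu=1.0, rng=0)
# For the symmetric case (theta = 0) the excess kurtosis is 3*nu,
# so the samples are heavier-tailed than a Gaussian for any nu > 0.
```

Because the randomness enters only through `g` and `z`, gradients with respect to `mu`, `theta`, and `sigma` pass through the sampler, which is the reparameterization property the abstract highlights.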

Paper Structure

This paper contains 33 sections, 3 theorems, 17 equations, 4 figures, 7 tables.

Key Result

Theorem 1

Let $Z$ be a real-valued random variable with regularly varying tail: $P(Z > z) = z^{-\alpha} L(z)$ for some $\alpha > 0$, where $L$ is slowly varying. Let $f: \mathbb{R} \to \mathbb{R}$ satisfy: $f(z) = cz + o(z)$ as $z \to \infty$ for some constant $c > 0$. Then $X = f(Z)$ is also regularly varying with index $\alpha$: $P(X > x) = x^{-\alpha} \tilde{L}(x)$, where $\tilde{L}(x) = c^\alpha L(x)$ is slowly varying. $\blacktriangleleft$
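Theorem 1 can be sanity-checked numerically: draw from a Pareto base with known tail index $\alpha$, push the samples through an asymptotically linear map, and compare Hill tail-index estimates before and after. The map and the choice $k = 500$ below are illustrative, not taken from the paper:

```python
import numpy as np

def hill_index(x, k=500):
    """Hill estimator of the tail index alpha from the k largest
    order statistics: 1 / mean(log(x_(i) / x_(k+1))), i = 1..k."""
    s = np.sort(x)[::-1]
    return 1.0 / np.mean(np.log(s[:k] / s[k]))

rng = np.random.default_rng(0)
z = rng.pareto(2.5, size=200_000) + 1.0   # exact Pareto: P(Z > z) = z^{-2.5}
f = lambda t: 3.0 * t + np.tanh(t)        # f(z) = 3z + o(z), asymptotically linear
x = f(z)

# Both estimates should lie close to the base tail index alpha = 2.5,
# consistent with the theorem's claim that f preserves the tail index.
print(hill_index(z), hill_index(x))
```

The slowly varying factor changes only by the constant $c^\alpha$ (here $3^{2.5}$), which the Hill estimator is insensitive to, so the two estimates agree up to sampling noise of order $\alpha/\sqrt{k}$.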

Figures (4)

  • Figure 1: Hill estimator plot for S&P 500 returns. The estimated exponent of approximately 2.5 indicates substantially heavier tails than the Gaussian distribution. The Hill estimator assumes power-law decay; this estimate should be read as evidence that heavy-tailed bases are warranted, not as a claim about the asymptotic tail form of the fitted model.
  • Figure 2: Density comparison on S&P 500 returns. Lévy-Flows (VG, NIG) capture the peaked center and heavy tails better than Gaussian-based flows.
  • Figure 3: Tail comparison (log scale). The Lévy-Flow models maintain probability mass in the tails, while light-tailed models underestimate extreme event probabilities.
  • Figure 4: QQ plots comparing model samples to empirical data. The Lévy-Flow (left) tracks the diagonal more closely in the tails, while the Gaussian-Flow (right) systematically underestimates extreme quantiles.
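The risk metrics behind these comparisons (VaR calibration and Expected Shortfall) can be computed directly from model samples. A minimal sketch, under the assumed convention that losses sit in the left tail of the return distribution and risk numbers are reported as positive losses (the paper's exact conventions are not shown on this page):

```python
import numpy as np

def var_es(returns, level=0.95):
    """Historical VaR and Expected Shortfall at the given confidence
    level, treating losses as the left tail of returns."""
    q = np.quantile(returns, 1.0 - level)  # e.g. the 5% left-tail quantile
    var = -q                               # VaR reported as a positive loss
    es = -returns[returns <= q].mean()     # mean loss beyond the VaR level
    return var, es

rng = np.random.default_rng(0)
r = rng.standard_t(df=3, size=100_000) * 0.01  # heavy-tailed toy "returns"
var95, es95 = var_es(r, 0.95)
# ES is the average loss in the tail beyond VaR, so es95 > var95.
```

Calibration in the sense used by the abstract means the empirical exceedance rate $P(r < -\mathrm{VaR}_{95})$ on held-out data matches the nominal 5%.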

Theorems & Definitions (7)

  • Definition 1: Regular Variation
  • Theorem 1: Tail Index Preservation under Asymptotically Linear Flows
  • Proof
  • Corollary 1: Application to Neural Spline Flows
  • Proposition 1: Identity-Tail Preservation for Arbitrary Bases
  • Proof
  • Remark 1: Sensitivity to Tail Bound $B$