A PyTorch Framework for Scalable Non-Crossing Quantile Regression

Abstract

Quantile regression is fundamental to distributional modeling, yet estimating multiple quantiles independently frequently produces crossing, in which estimated quantile functions violate monotonicity and imply impossible negative probability densities. While Constrained Joint Quantile Regression (CJQR) elegantly enforces non-crossing by construction, existing Linear Programming formulations scale poorly, rendering them impractical for large-scale applications. We present the first scalable solution using PyTorch automatic differentiation: CJQR-ALM, which combines the Augmented Lagrangian Method with a differentiable pinball loss and L-BFGS optimization. Our approach substantially reduces computational complexity, achieving near-zero crossing rates on datasets exceeding 70,000 observations within minutes. The differentiable formulation extends naturally to neural network architectures for non-linear conditional quantile estimation. Application to Student Growth Percentile calculations demonstrates practical utility for educational assessment, while simulation studies show a negligible accuracy cost in RMSE relative to unconstrained estimation, a favorable trade-off for applications requiring valid probability statements across finance, healthcare, and engineering.
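
To make the recipe concrete, the sketch below illustrates one plausible reading of CJQR-ALM on synthetic data: linear per-quantile models fit jointly with a pinball loss, an augmented Lagrangian term penalizing crossing violations between adjacent quantile levels, and L-BFGS inner solves. This is a minimal sketch under stated assumptions, not the paper's implementation; all names (X, y, taus, rho, predict) and the choice of a linear model are illustrative.

```python
import torch

# Hypothetical setup: synthetic data and linear quantile models, one
# weight vector plus intercept per quantile level. Sizes are arbitrary.
torch.manual_seed(0)
n, d = 2000, 5
X = torch.randn(n, d)
y = X @ torch.randn(d) + 0.5 * torch.randn(n)
taus = torch.tensor([0.1, 0.5, 0.9])                   # quantile levels, sorted ascending

W = torch.zeros(len(taus), d + 1, requires_grad=True)  # per-level weights + intercept
lam = torch.zeros(n, len(taus) - 1)                    # ALM multipliers, one per constraint
rho = 1.0                                              # penalty strength (illustrative)

def predict(W):
    Xb = torch.cat([X, torch.ones(n, 1)], dim=1)
    return Xb @ W.T                                    # (n, K) predicted quantiles

def pinball(q):
    # Differentiable pinball loss: rho_tau(u) = max(tau*u, (tau-1)*u)
    u = y.unsqueeze(1) - q
    return torch.mean(torch.maximum(taus * u, (taus - 1) * u))

def augmented_lagrangian(W):
    q = predict(W)
    g = q[:, :-1] - q[:, 1:]                           # non-crossing requires g <= 0
    shifted = torch.clamp(lam + rho * g, min=0.0)      # standard ALM inequality term
    return pinball(q) + (shifted**2 - lam**2).sum() / (2 * rho * n)

for _ in range(10):                                    # outer ALM iterations
    opt = torch.optim.LBFGS([W], max_iter=50, line_search_fn="strong_wolfe")

    def closure():
        opt.zero_grad()
        loss = augmented_lagrangian(W)
        loss.backward()
        return loss

    opt.step(closure)                                  # L-BFGS inner solve
    with torch.no_grad():
        g = predict(W)[:, :-1] - predict(W)[:, 1:]
        lam = torch.clamp(lam + rho * g, min=0.0)      # multiplier update
        rho *= 2.0                                     # tighten penalty each round

q = predict(W).detach()
print("crossing rate:", ((q[:, :-1] - q[:, 1:]) > 1e-6).float().mean().item())
```

Because the whole objective is expressed in differentiable PyTorch operations, swapping the linear predict for a neural network is a one-line change, which is presumably how the abstract's extension to non-linear conditional quantiles works; growing rho across outer iterations is what drives residual crossing violations toward the near-zero rates the abstract reports.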