Modern Statistics by Kriging
Tomasz Suslo
TL;DR
This work develops S-statistics within a kriging framework that uses only random-variable means and covariances to minimize the mean-squared error of mean estimation. By formulating an unbiasedness-constrained kriging system with covariance structure $\rho$ and a Lagrange multiplier, it derives the linear system for weights $\omega_j^i$ and yields a generalized least-squares solution in the asymptotic regime, $\omega = \frac{\Lambda^{-1}F}{F'\Lambda^{-1}F}$. The paper analyzes asymptotic properties, showing the minimized variance approaches $\sigma^2$ and that the estimator concentrates on the field mean as $n$ grows, while auto-estimation reveals relationships between estimation variance and the underlying covariance. It also connects these results to simple statistics, noting that in white noise the unbiased variance estimator $\frac{1}{n-1}\sum (V_i - m)^2$ serves as a natural lower bound for the minimized variance and that $\hat{V}_j$ converges to the true mean in the limit. These insights link kriging-based estimation with classical variance accounting, offering a principled approach to variance estimation under random-variable-based statistics.
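The asymptotic weight formula $\omega = \frac{\Lambda^{-1}F}{F'\Lambda^{-1}F}$ quoted above can be illustrated numerically. The sketch below (an illustrative assumption, not code from the paper) uses an exponentially decaying covariance matrix for $\Lambda$ and a constant-mean design vector $F$ of ones, and checks that the resulting generalized least-squares weights satisfy the unbiasedness constraint $\sum_j \omega_j = 1$:

```python
import numpy as np

# Minimal sketch of the asymptotic GLS weights omega = Lambda^{-1} F / (F' Lambda^{-1} F)
# for estimating a constant mean. Names (Lambda, F, omega) follow the TL;DR; the
# exponential covariance model below is an illustrative assumption.
n = 5
idx = np.arange(n)
Lambda = np.exp(-np.abs(idx[:, None] - idx[None, :]))  # covariance matrix rho(|i-j|)
F = np.ones((n, 1))                                     # design vector for a constant mean

Li_F = np.linalg.solve(Lambda, F)                       # Lambda^{-1} F without explicit inverse
omega = Li_F / (F.T @ Li_F)                             # GLS weights

print(omega.ravel())
print(float(omega.sum()))                               # unbiasedness: weights sum to 1
```

With this symmetric Toeplitz covariance the weights come out symmetric, and interior points receive less weight than the boundary points because their information overlaps more with their neighbours; for white noise ($\Lambda = \sigma^2 I$) the formula reduces to the equal weights $\omega_j = 1/n$ of the sample mean.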
Abstract
We present a statistics (S-statistics) based only on random variables (not random values), with the mean-squared error of mean estimation as the concept of error.
