
Cone-Induced Geometry and Sampling for Determinantal PSD-Weighted Graph Models

Papri Dey

Abstract

We study determinantal PSD-weighted graph models in which edge parameters lie in a product positive semidefinite cone and the block graph Laplacian generates the log-det energy \[ \Phi(W)=-\log\det(L(W)+R). \] The model admits explicit directional derivatives, a Rayleigh-type factorization, and a pullback of the affine-invariant log-det metric, yielding a natural geometry on the PSD parameter space. In low PSD dimension, we validate this geometry through rank-one probing and finite-difference curvature calibration, showing that it accurately ranks locally sensitive perturbation directions. We then use the same metric to define intrinsic Gibbs targets and geometry-aware Metropolis-adjusted Langevin proposals for cone-supported sampling. In the symmetric positive definite setting, the resulting sampler is explicit and improves sampling efficiency over a naive Euclidean-drift baseline under the same target law. These results provide a concrete, mathematically grounded computational pipeline from determinantal PSD graph models to intrinsic geometry and practical cone-aware sampling.
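A minimal numerical sketch of the log-det energy above, assuming a toy triangle graph with $d\times d$ PSD edge weights; the graph, the regularizer $R=\tfrac12 I$, and all helper names here are illustrative, not taken from the paper:

```python
import numpy as np

def block_laplacian(W_edges, edges, n, d):
    # Assemble the block graph Laplacian L(W): for each edge (i, j) with
    # PSD weight W_ij, add W_ij to the (i,i) and (j,j) diagonal blocks
    # and subtract it from the (i,j) and (j,i) off-diagonal blocks.
    L = np.zeros((n * d, n * d))
    for (i, j), Wij in zip(edges, W_edges):
        bi, bj = slice(i * d, (i + 1) * d), slice(j * d, (j + 1) * d)
        L[bi, bi] += Wij
        L[bj, bj] += Wij
        L[bi, bj] -= Wij
        L[bj, bi] -= Wij
    return L

def log_det_energy(W_edges, edges, n, d, R):
    # Phi(W) = -log det(L(W) + R); R keeps the argument positive definite.
    X = block_laplacian(W_edges, edges, n, d) + R
    sign, logdet = np.linalg.slogdet(X)
    assert sign > 0, "L(W) + R must be positive definite"
    return -logdet

# Toy example: triangle graph on n = 3 nodes with d = 2 PSD edge weights.
rng = np.random.default_rng(0)
n, d = 3, 2
edges = [(0, 1), (1, 2), (0, 2)]
W_edges = [a @ a.T for a in (rng.standard_normal((d, d)) for _ in edges)]
R = 0.5 * np.eye(n * d)                 # regularizer: X = L(W) + R in S_++
phi_val = log_det_energy(W_edges, edges, n, d, R)
print(phi_val)
```

Since $L(W)$ is a sum of PSD terms $(e_i-e_j)(e_i-e_j)^{\!\top}\!\otimes W_{ij}$, the shift $R\succ 0$ guarantees $X=L(W)+R\in\mathcal{S}_{++}$ and the energy is finite.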


Paper Structure

This paper contains 22 sections, 5 theorems, 124 equations, 3 figures, 3 tables, 1 algorithm.

Key Result

Lemma 3.1

For any $W\in\mathcal{K}$ and $U,V\in\mathcal{K}$, writing $X(W)=L(W)+R$, the second directional derivative of $\Phi_R$ is \[ D^2\Phi_R(W)[U,V]=\operatorname{tr}\!\big(X(W)^{-1}L(U)\,X(W)^{-1}L(V)\big). \] In particular, \[ D^2\Phi_R(W)[U,U]=\big\|X(W)^{-1/2}L(U)\,X(W)^{-1/2}\big\|_F^2\ \ge\ 0, \] so $\Phi_R$ is convex along every cone direction. Moreover, the bilinear form $g_W(U,V)=D^2\Phi_R(W)[U,V]$ is positive semidefinite on $\mathcal{K}\times\mathcal{K}$ and is the pullback of the affine-invariant log-det metric on $\mathcal{S}_{++}^{md}$ under the affine map $W\mapsto X(W)$.
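The lemma's quadratic form can be checked numerically against a central finite difference of $\Phi_R$, using the standard log-det identity $D^2\Phi_R(W)[U,U]=\operatorname{tr}\big(X^{-1}L(U)X^{-1}L(U)\big)$ with $X=L(W)+R$. The toy triangle graph, step size, and helper names below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def block_laplacian(W_edges, edges, n, d):
    # L(W) is linear in the PSD edge weights W_ij.
    L = np.zeros((n * d, n * d))
    for (i, j), Wij in zip(edges, W_edges):
        bi, bj = slice(i * d, (i + 1) * d), slice(j * d, (j + 1) * d)
        L[bi, bi] += Wij; L[bj, bj] += Wij
        L[bi, bj] -= Wij; L[bj, bi] -= Wij
    return L

def phi(W_edges, edges, n, d, R):
    # Phi_R(W) = -log det(L(W) + R)
    sign, logdet = np.linalg.slogdet(block_laplacian(W_edges, edges, n, d) + R)
    return -logdet

rng = np.random.default_rng(1)
n, d = 3, 2
edges = [(0, 1), (1, 2), (0, 2)]
W = [a @ a.T for a in (rng.standard_normal((d, d)) for _ in edges)]
U = [a @ a.T for a in (rng.standard_normal((d, d)) for _ in edges)]  # cone direction
R = 0.5 * np.eye(n * d)

# Exact quadratic form from the lemma: g_W(U, U) = tr(X^{-1} L(U) X^{-1} L(U)).
X = block_laplacian(W, edges, n, d) + R
Xinv_LU = np.linalg.solve(X, block_laplacian(U, edges, n, d))
g_exact = np.trace(Xinv_LU @ Xinv_LU)

# Central second finite difference of Phi_R along the line W + t*U.
t = 1e-4
Wp = [w + t * u for w, u in zip(W, U)]
Wm = [w - t * u for w, u in zip(W, U)]
g_fd = (phi(Wp, edges, n, d, R) - 2 * phi(W, edges, n, d, R)
        + phi(Wm, edges, n, d, R)) / t**2

print(g_exact, g_fd)  # the two values should agree to several digits
```

Because $X\mapsto L(W)+R$ is affine in $W$, this second derivative has no curvature correction term, which is exactly why the pullback of the affine-invariant metric takes the closed form above.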

Figures (3)

  • Figure 1: PSD metric sensitivity study for $d=3$. Panel (a) confirms numerically that finite-difference curvature tracks the exact metric score $s(\Delta)$. Panel (b) shows that ranking by $s(\Delta)$ yields near-oracle recovery of sensitivity mass and substantially outperforms random selection.
  • Figure 2: SPD($d=3$) worked example: cross-method ECDF overlays under the same intrinsic target.
  • Figure 3: SPD($d=3$) worked example: pooled histogram overlays.

Theorems & Definitions (16)

  • Definition 2.1
  • Definition 2.2
  • Lemma 3.1: Convexity and induced pullback metric
  • Remark 3.2: Affine inheritance of self-concordance
  • Remark 3.3: Hyperbolic polynomials and true barriers
  • Theorem 4.1: Bakry–Émery criterion [BakryEmery1985, BakryGentilLedoux2014]
  • Proposition 4.2
  • Proof
  • Remark 4.3
  • Example 4.4: Concentration on the PSD cone
  • ...and 6 more