
Some Remarks on the Optimal Level of Randomization in Global Optimization

Ted Theodosopoulos

TL;DR

The paper investigates the optimal level of randomization in stochastic restart algorithms for global optimization, showing that a nonzero level of randomization can improve the robustness of asymptotic convergence on unknown rugged energy landscapes. It leverages a martingale representation of exit-time moment generating functions and large deviations to relate the convergence rate to a critical exponent $\xi_{\rm crit}$, and reduces the optimization to solving a pair of polynomial equations that determine the optimal randomness level $p^*$. Computational experiments across random and parametric energy landscapes reveal a phase transition in performance robustness and show that the optimal randomness grows with landscape steepness, with qualitatively similar behavior across landscape families. The results provide practical guidance for restart strategies and connect to parallelized simulated annealing approaches, suggesting a near-universal benefit of nonzero randomization under broad conditions.

Abstract

For a class of stochastic restart algorithms we address the effect of a nonzero level of randomization in maximizing the convergence rate for general energy landscapes. The resulting characterization of the optimal level of randomization is investigated computationally for random as well as parametric families of rugged energy landscapes.
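To make the setting concrete, the following is a minimal illustrative sketch of a restart-style search on a toy rugged landscape: with probability $p$ per step the state is re-drawn uniformly at random (the randomization level), and otherwise a greedy local move is taken. The landscape `rugged_f`, the greedy move rule, and all parameters are hypothetical stand-ins for exposition, not the paper's algorithm class.

```python
import random

def rugged_f(x):
    # Toy 1-D rugged energy landscape on {0, ..., 99}: a quadratic well
    # perturbed by a deterministic pseudo-random term that creates local minima.
    return (x - 70) ** 2 / 100.0 + 3.0 * ((x * 2654435761) % 7)

def restart_descent(f, domain, p, n_steps, seed=0):
    """Greedy descent that, with probability p per step, restarts uniformly
    at random over the domain; otherwise it takes the best neighboring move.
    Returns the best state seen (illustrative sketch only)."""
    rng = random.Random(seed)
    x = rng.choice(domain)
    best = x
    for _ in range(n_steps):
        if rng.random() < p:
            x = rng.choice(domain)            # randomized restart
        else:
            lo = max(x - 1, domain[0])
            hi = min(x + 1, domain[-1])
            x = min((lo, x, hi), key=f)       # greedy local move
        if f(x) < f(best):
            best = x
    return best

domain = list(range(100))
# p = 0 is pure greedy descent, which can stall in a local minimum;
# a nonzero p lets the search escape basins and reach deeper minima.
for p in (0.0, 0.1, 0.5):
    b = restart_descent(rugged_f, domain, p, n_steps=500)
    print(p, b, round(rugged_f(b), 2))
```

The paper's question is, in effect, how to choose $p$ optimally for an unknown landscape; the sketch only shows why $p = 0$ and $p = 1$ are both typically suboptimal.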

Paper Structure

This paper contains 6 sections, 5 theorems, 47 equations, 8 figures.

Key Result

Theorem 2.1

Fix $\epsilon \in f({\mathcal{X}})$, with the convention that $\sum_{i=0}^c a_i = 0$ when $c < 0$. Then the following statements hold:

Figures (8)

  • Figure 1: Convergence rate as a function of $p$ for a randomly generated energy landscape
  • Figure 2: Phase Transition for the Algorithm Class ${\mathcal{A}}_2$
  • Figure 3: Dependence of ${p_{\rm best}}$ on $\beta$ for exponential energy landscapes
  • Figure 4: Dependence of ${p_{\rm best}}$ on $\alpha$ for polynomial energy landscapes
  • Figure 5: Dependence of ${p_{\rm best}}$ on $\gamma$ for logarithmic energy landscapes
  • ...and 3 more figures

Theorems & Definitions (7)

  • Theorem 2.1
  • Lemma 3.1
  • Lemma 3.2
  • Theorem 3.3
  • Corollary 3.4
  • Definition 6.1
  • Definition 6.2