Some Remarks on the Optimal Level of Randomization in Global Optimization
Ted Theodosopoulos
TL;DR
The paper investigates the optimal level of randomization in stochastic restart algorithms for global optimization, showing that a nonzero level of randomization can improve the robustness of asymptotic convergence on unknown rugged energy landscapes. It leverages a martingale representation of exit-time moment generating functions, together with large deviations theory, to relate the convergence rate to a critical exponent $\xi_{\rm crit}$, reducing the optimization to a pair of polynomial equations that determine the optimal randomness level $p^*$. Computational experiments on random as well as parametric families of energy landscapes reveal a phase transition in performance robustness and show that the optimal level of randomness grows with the steepness of the landscape, with qualitatively similar behavior across landscape families. The results offer practical guidance for restart strategies, connect to parallelized simulated annealing, and suggest a near-universal benefit of nonzero randomization under broad conditions.
Abstract
For a class of stochastic restart algorithms we address the effect of a nonzero level of randomization in maximizing the convergence rate for general energy landscapes. The resulting characterization of the optimal level of randomization is investigated computationally for random as well as parametric families of rugged energy landscapes.
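To make the algorithm class concrete, here is a minimal, hypothetical sketch of a stochastic restart scheme: at each step the search either restarts from a uniformly random state (with probability $p$) or takes a greedy local step. The toy landscape, state space, and parameter values below are illustrative only and do not come from the paper; the paper's analysis of the optimal $p^*$ applies to a more general class of algorithms and landscapes.

```python
import random

def randomized_restart_search(energy, neighbors, restart, x0, p, n_steps, seed=0):
    """Greedy local search mixed with random restarts.

    energy:    state -> float (the energy landscape to minimize)
    neighbors: state -> list of candidate next states
    restart:   rng -> a fresh uniformly random state (the randomization move)
    p:         level of randomization; p = 0 is pure local search,
               p = 1 is blind random search.
    """
    rng = random.Random(seed)
    x, best = x0, x0
    for _ in range(n_steps):
        if rng.random() < p:
            x = restart(rng)                    # random restart with prob. p
        else:
            x = min(neighbors(x), key=energy)   # greedy local move otherwise
        if energy(x) < energy(best):
            best = x
    return best

# Toy rugged landscape on {0, ..., 99}: periodic ruggedness plus a slow
# drift toward the region around i = 70-73, so pure descent gets trapped.
E = lambda i: (i % 10) + abs(i - 73) / 10.0
nbrs = lambda i: [max(i - 1, 0), min(i + 1, 99)]
uniform = lambda r: r.randrange(100)

best_mixed = randomized_restart_search(E, nbrs, uniform, x0=5, p=0.2, n_steps=2000)
best_pure = randomized_restart_search(E, nbrs, uniform, x0=5, p=0.0, n_steps=2000)
# On this landscape the pure local search (p = 0) stalls in a nearby local
# minimum, while the nonzero randomization level escapes it.
```

The comparison between `best_mixed` and `best_pure` is only meant to illustrate the qualitative point of the paper: some nonzero level of randomization can dominate the purely deterministic strategy on rugged landscapes.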
