Generalized Heavy-tailed Mutation for Evolutionary Algorithms

Anton V. Eremeev, Dmitri V. Silaev, Valentin A. Topchii

Abstract

The heavy-tailed mutation operator, proposed by Doerr, Le, Makhmara, and Nguyen (2017) for evolutionary algorithms, is based on the power-law assumption of mutation rate distribution. Here we generalize the power-law assumption using a regularly varying constraint on the distribution function of mutation rate. In this setting, we generalize the upper bounds on the expected optimization time of the $(1+(λ,λ))$ genetic algorithm obtained by Antipov, Buzdalov and Doerr (2022) for the OneMax function class parametrized by the problem dimension $n$. In particular, it is shown that, on this function class, the sufficient conditions of Antipov, Buzdalov and Doerr (2022) on the heavy-tailed mutation, ensuring the $O(n)$ optimization time in expectation, may be generalized as well. This optimization time is known to be asymptotically smaller than what can be achieved by the $(1+(λ,λ))$ genetic algorithm with any static mutation rate. A new version of the heavy-tailed mutation operator is proposed, satisfying the generalized conditions, and promising results of computational experiments are presented.
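For concreteness, the heavy-tailed mutation operator of Doerr, Le, Makhmara, and Nguyen (2017) referred to above draws a mutation strength $\alpha$ from a power-law distribution with exponent $\beta$ on $\{1,\dots,\lfloor n/2\rfloor\}$ and then flips each bit independently with probability $\alpha/n$. The sketch below illustrates this standard construction; the function names and the choice $\beta = 1.5$ are illustrative only, and the generalized operator proposed in this paper would replace the power-law weights with a regularly varying distribution.

```python
import random

def heavy_tailed_rate(n, beta=1.5, rng=random):
    """Sample a mutation strength alpha from a power-law distribution
    P(alpha = i) proportional to i**(-beta) on {1, ..., n // 2}."""
    upper = n // 2
    weights = [i ** (-beta) for i in range(1, upper + 1)]
    r = rng.random() * sum(weights)
    for i, w in enumerate(weights, start=1):
        r -= w
        if r <= 0:
            return i
    return upper  # guard against floating-point rounding

def heavy_tailed_mutation(x, beta=1.5, rng=random):
    """Flip each bit of the 0/1 list x independently with
    probability alpha / n, where alpha is drawn heavy-tailed."""
    n = len(x)
    alpha = heavy_tailed_rate(n, beta, rng)
    return [bit ^ (1 if rng.random() < alpha / n else 0) for bit in x]
```

Because small strengths carry most of the probability mass while large strengths retain non-negligible weight, a single choice of $\beta$ lets the operator behave well across many problem landscapes without tuning a static mutation rate.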

Paper Structure

This paper contains 9 sections, 6 theorems, 43 equations, and 1 table.

Key Result

Lemma 1

One iteration of algorithm $\mathcal{A}$ with fixed $\lambda$ on the fitness function OneMax, starting from a solution $x\in Z_{s}$, improves the current fitness value with a probability satisfying the inequality for some constant $C>0$ independent of $n$. $\blacktriangleleft$

Theorems & Definitions (10)

  • Lemma 1
  • Lemma 2
  • Definition 1
  • Lemma 3
  • Definition 2
  • Definition 3
  • Definition 4
  • Theorem 1
  • Theorem 2
  • Corollary 1