Evolving Problems to Learn about Particle Swarm
Optimisers and other Search Algorithms
W. B. Langdon and R. Poli. CSM-455. Computer Science, University of Essex, UK, (June 2006)
Abstract
We use evolutionary computation (EC) to automatically
find problems which demonstrate the strengths and
weaknesses of modern search heuristics. In particular
we analyse Particle Swarm Optimization (PSO),
Differential Evolution (DE) and Covariance Matrix
Adaptation--Evolution Strategy (CMA-ES). Each
evolutionary algorithm is contrasted with the others
and with a robust non-stochastic gradient follower
(i.e. a hill climber) based on Newton-Raphson. The
evolved benchmark problems yield insights into the
operation of PSOs and illustrate the benefits and
drawbacks of different population sizes, velocity
limits and constriction (friction) coefficients. The
fitness landscapes made by genetic programming (GP)
reveal new swarm phenomena, such as deception, thereby
explaining how swarms work and allowing us to devise
better extended particle swarm systems. The method
could be applied to any type of optimiser.
@techreport{langdon:2006:CSM455,
abstract = {We use evolutionary computation (EC) to automatically
find problems which demonstrate the strengths and
weaknesses of modern search heuristics. In particular
we analyse Particle Swarm Optimization (PSO),
Differential Evolution (DE) and Covariance Matrix
Adaptation--Evolution Strategy (CMA-ES). Each
evolutionary algorithm is contrasted with the others
and with a robust non-stochastic gradient follower
(i.e. a hill climber) based on Newton-Raphson. The
evolved benchmark problems yield insights into the
operation of PSOs and illustrate the benefits and
drawbacks of different population sizes, velocity
limits and constriction (friction) coefficients. The
fitness landscapes made by genetic programming (GP)
reveal new swarm phenomena, such as deception, thereby
explaining how swarms work and allowing us to devise
better extended particle swarm systems. The method
could be applied to any type of optimiser.},
address = {UK},
author = {Langdon, W. B. and Poli, R.},
biburl = {https://www.bibsonomy.org/bibtex/2b18a6b8080a30ca20a7ad05bf1c353d6/brazovayeye},
institution = {Computer Science, University of Essex},
issn = {1744-8050},
keywords = {CMA, DE, PSO, XPS, algorithms, genetic programming},
month = {June},
notes = {To be published as \cite{langdon:2006:TEC}},
number = {CSM-455},
size = {37 pages},
title = {Evolving Problems to Learn about Particle Swarm
Optimisers and other Search Algorithms},
url = {http://www.cs.essex.ac.uk/technical-reports/2006/csm-455.pdf},
year = 2006
}