# Investigating the parameter space of evolutionary algorithms

```bibtex
@article{Sipper2018InvestigatingTP,
  title   = {Investigating the parameter space of evolutionary algorithms},
  author  = {Moshe Sipper and Weixuan Fu and Karuna Ahuja and Jason H. Moore},
  journal = {BioData Mining},
  year    = {2018},
  volume  = {11}
}
```

Evolutionary computation (EC) has been widely applied to biological and biomedical data. The practice of EC involves the tuning of many parameters, such as population size, generation count, selection size, and crossover and mutation rates. Through an extensive series of experiments over multiple evolutionary algorithm implementations and 25 problems, we show that parameter space tends to be rife with viable parameters, at least for the problems studied herein. We discuss the implications of…
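The kind of experiment the abstract describes can be illustrated with a minimal sketch (not the paper's actual experimental setup): a tiny generational GA on the OneMax problem, run across a small grid of the parameters the abstract names. The grid values, problem, and helper names here are illustrative assumptions.

```python
# Minimal parameter-sweep sketch: run a simple GA over a grid of
# (population size, generations, mutation rate, crossover rate) settings
# and record the best fitness each setting reaches on OneMax.
import random
from itertools import product

def onemax(bits):
    # Fitness: number of 1-bits; optimum equals the bitstring length.
    return sum(bits)

def run_ga(pop_size, generations, mutation_rate, crossover_rate,
           n_bits=20, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Binary tournament selection.
            a, b = rng.sample(pop, 2)
            return a if onemax(a) >= onemax(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            if rng.random() < crossover_rate:
                cut = rng.randrange(1, n_bits)  # one-point crossover
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # Bit-flip mutation applied gene by gene.
            child = [1 - g if rng.random() < mutation_rate else g
                     for g in child]
            nxt.append(child)
        pop = nxt
    return max(onemax(ind) for ind in pop)

def sweep():
    # 2 values per parameter -> 16 settings; returns best fitness per setting.
    grid = product([20, 50], [30, 60], [0.01, 0.05], [0.5, 0.9])
    return {(ps, g, mu, cx): run_ga(ps, g, mu, cx)
            for ps, g, mu, cx in grid}
```

Counting how many settings reach near-optimal fitness gives a "viable fraction" of the sampled parameter space, which is the quantity the paper's finding concerns.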

#### 45 Citations

On the analysis of hyper-parameter space for a genetic programming system with iterated F-Race

- Computer Science
- Soft Comput.
- 2020

This work builds on recent findings and explores the hyper-parameter space of a specific GP system called neat-GP that controls model size, using three variants of the iterated F-Race algorithm, applied to GP for the first time.

Evolutionary computation: an investigation of parameter space

- Computer Science
- GECCO
- 2018

It is shown, through an extensive series of experiments over multiple evolutionary algorithm implementations and 25 problems, that parameter space tends to be rife with viable parameters, somewhat in contrast with common lore.

What Can We Learn from Multi-Objective Meta-Optimization of Evolutionary Algorithms in Continuous Domains?

- Computer Science
- Mathematics
- 2019

It is shown that by using a multi-objective genetic algorithm to tune an EA, it is possible not only to find good parameter sets considering several objectives at the same time, but also to derive generalizable results which can provide guidelines for designing EA-based applications.

Correction to: Investigating the parameter space of evolutionary algorithms

- Computer Science, Medicine
- BioData Mining
- 2019

Following publication of the original article [1], an error was reported in one of the experiments.

Self-Adaptation of Meta-Parameters for Lamarckian-Inherited Neuromodulated Neurocontrollers in the Pursuit-Evasion Game

- Computer Science
- 2020 IEEE Symposium Series on Computational Intelligence (SSCI)
- 2020

It is shown that self-adaptation can be used to automatically tune and control meta-parameters during evolution, and that under some circumstances self-adaptation may improve the performance of the evolutionary algorithm.

Solution and Fitness Evolution (SAFE): A Study of Multiobjective Problems

- Computer Science
- 2019 IEEE Congress on Evolutionary Computation (CEC)
- 2019

An investigation of SAFE’s adaptation and application to multiobjective problems, wherein candidate objective functions explore different weightings of each objective, suggests that SAFE, and the concept of coevolving solutions and objective functions, can identify a similar set of optimal multiobjective solutions without explicitly employing a Pareto front for fitness calculation and parent selection.

Universal Learning Machine with Genetic Programming

- Computer Science
- IJCCI
- 2019

Experimental evidence is presented that UGP is able to improve the models produced by all the studied machine learning algorithms in isolation, on three complex real-life problems.

pyGOURGS - global optimization of n-ary tree representable problems using uniform random global search

- Computer Science
- J. Open Source Softw.
- 2020

This software is devised to perform uniform random global search, also known as pure random search, on these problems; the challenge lies in creating a system that enumerates all possible solutions and is able to randomly select from this space of solutions.

Metaheuristics and Swarm Methods: A Discussion on Their Performance and Applications

- Computer Science
- Intelligent Systems Reference Library
- 2019

This chapter presents a discussion centered on several observable characteristics of nature-inspired methods and their influence on overall performance, and surveys some of the most important areas of science and technology where nature-inspired algorithms have found applications.

Genetic Algorithms for the Optimization of Diffusion Parameters in Content-Based Image Retrieval

- Computer Science
- ICDSC
- 2019

This work proposes to use genetic algorithms to find, for each dataset, the setting of all the diffusion parameters that is optimal with respect to retrieval performance; the approach is faster than the reference methods it is compared against.

#### References

Showing 1–10 of 39 references

Comparing parameter tuning methods for evolutionary algorithms

- Computer Science
- 2009 IEEE Congress on Evolutionary Computation
- 2009

The most important issues related to tuning EA parameters are discussed, a number of existing tuning methods are described, and a modest experimental comparison among them is presented, in the hope of inspiring fellow researchers to further work.

Parameter Setting in Evolutionary Algorithms

- Computer Science
- Studies in Computational Intelligence
- 2007

One of the main difficulties of applying an evolutionary algorithm (or, as a matter of fact, any heuristic method) to a given problem is to decide on an appropriate set of parameter values. Typically…

Parameter Tuning of Evolutionary Algorithms: Generalist vs. Specialist

- Computer Science
- EvoApplications
- 2010

It is demonstrated that REVAC can also tune an EA to a set of problems (a whole test suite) and obtain robust, rather than problem-tailored, parameter values, yielding an EA that is a ‘generalist’ rather than a ‘specialist’.

Evolutionary Algorithm Parameters and Methods to Tune Them

- Computer Science
- Autonomous Search
- 2012

In this chapter we discuss the notion of Evolutionary Algorithm (EA) parameters and propose a distinction between EAs and EA instances, based on the type of parameters used to specify their details.…

Adaptation in evolutionary computation: a survey

- Computer Science
- Proceedings of 1997 IEEE International Conference on Evolutionary Computation (ICEC '97)
- 1997

This paper develops a classification of adaptation on the basis of the mechanisms used, and the level at which adaptation operates within the evolutionary algorithm. Expand

Logistic regression for parameter tuning on an evolutionary algorithm

- Computer Science
- 2005 IEEE Congress on Evolutionary Computation
- 2005

This paper proposes the utilization of logistic regression, a statistical tool, for parameter tuning of an evolutionary algorithm called ProtoG; the algorithm is applied to the traveling salesman problem.

Parameter Setting in EAs: a 30 Year Perspective

- Computer Science, Physics
- Parameter Setting in Evolutionary Algorithms
- 2007

This chapter provides a historical overview of this issue, discussing both manual and automated approaches to parameter setting in evolutionary algorithms, and suggesting when a particular strategy might be appropriate.

Parameter Control in Evolutionary Algorithms

- Computer Science
- Parameter Setting in Evolutionary Algorithms
- 2007

A classification of different approaches based on a number of complementary features is provided, and special attention is paid to setting parameters on-the-fly, which has the potential of adjusting the algorithm to the problem while solving the problem.

Meta-evolutionary programming

- Computer Science
- [1991] Conference Record of the Twenty-Fifth Asilomar Conference on Signals, Systems & Computers
- 1991

The authors address incorporating a meta-level evolutionary programming that can evolve optimal settings for these parameters while the search for the appropriate extrema is being conducted, and their results indicate the suitability of such a procedure.

Evolutionary programming made faster

- Mathematics, Computer Science
- IEEE Trans. Evol. Comput.
- 1999

A "fast EP" (FEP), which uses a Cauchy instead of a Gaussian mutation as the primary search operator, is proposed and tested empirically; an improved variant, IFEP, performs better than or as well as the better of FEP and CEP on most benchmark problems tested.