C# (CSharp) SwarmOps.Optimizers Namespace

Nested Namespaces

SwarmOps.Optimizers.Parallel

Classes

Name Description
DE Differential Evolution (DE) optimizer originally due to Storn and Price (1). This simple and efficient variant is based on 'The Joker' variant by Pedersen et al. (2). It has been found to be very versatile and works well on a wide range of optimization problems, but may require tuning (or meta-optimization) of its parameters. (See the first sketch after this list.)
DE.Parameters Control parameters.
DECrossover Variants of the crossover operator for Differential Evolution (DE), originally due to Storn and Price (1). (The DE sketch after this list illustrates the binomial 'bin' crossover.)
DESuite Differential Evolution (DE) optimizer originally due to Storn and Price (1). This suite offers combinations of DE variants and various perturbation schemes for its behavioural parameters. Note that this has complicated the implementation somewhat.
DESuite.Parameters.Rand1Bin.Jitter Control parameters for use with Rand1Bin crossover, Jitter.
DESuite.Parameters.Rand1Bin.NoDither Control parameters for use with Rand1Bin crossover, No Dither.
DESuite.Parameters.Rand1Bin.VecDither Control parameters for use with Rand1Bin crossover, VecDither.
GD Gradient Descent (GD), which follows the gradient of the problem downhill in small steps. (See sketch after this list.)
JDE Differential Evolution (DE) optimizer originally due to Storn and Price (1); the jDE variant is due to Brest et al. (2). This variant is claimed to be 'self-adaptive' in that it eliminates the need to choose two parameters of the original DE, but in reality it introduces six additional parameters, so the jDE variant has nine parameters instead of the original DE's three. (See sketch after this list.)
JDE.Parameters.Rand1Bin Control parameters for use with Rand1Bin crossover.
LUS Local Unimodal Sampling (LUS) optimizer originally due to Pedersen (1). Does local sampling with an exponential decrease of the sampling range. Works well for many optimization problems, especially when only short runs are allowed, and is particularly well suited as the overlaying meta-optimizer when tuning the parameters of another optimizer. (See sketch after this list.)
MESH Optimizer that iterates over all possible combinations of parameters fitting a mesh of a certain size. This is particularly useful for displaying performance landscapes from meta-optimization, relating choices of control parameters to the performance of the optimizer. (See sketch after this list.)
MOL Many Optimizing Liaisons (MOL) optimization method, devised as a simplification of the PSO method originally due to Eberhart et al. (1, 2). The MOL method has no attraction to the particle's own best-known position, and it also selects the particle to update at random instead of iterating over the entire swarm. It is similar to the 'Social Only' PSO suggested by Kennedy (3), and was studied more thoroughly by Pedersen et al. (4), who found it to sometimes outperform PSO and to have more easily tunable control parameters. (See sketch after this list.)
MOL.Parameters Control parameters.
MetaFitness Computes the standard Meta-Fitness measure; that is, performs a number of optimization runs on different problems and sums their results. (See sketch after this list.)
PS Pattern Search (PS); an early variant was originally due to Fermi and Metropolis at the Los Alamos nuclear laboratory, as described by Davidon (1). It is also sometimes called compass search. This is a slightly different variant by Pedersen (2). It works for a wide variety of optimization problems, especially when only a few iterations are allowed, but it tends to stagnate rather quickly. (See sketch after this list.)
PSO Particle Swarm Optimization (PSO) originally due to Eberhart et al. (1, 2). This is a 'plain vanilla' variant whose parameters can be tuned (or meta-optimized) to work well on a range of optimization problems; generally, however, the DE optimizer has been found to work better. (See sketch after this list.)
PSO.Parameters Control parameters.
RND Samples the search-space completely at random. Used for comparing other optimizers to 'worst-case' performance. (See sketch after this list.)
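
Code Sketches

The sketches below are minimal, self-contained C# illustrations of the algorithms listed above. They are hedged approximations for orientation only: they do not use or reproduce the SwarmOps API, and all class names, default parameter values and bound-handling choices are assumptions of the sketches, not SwarmOps' tuned or official values.

First, the classic DE/rand/1/bin scheme that the DE classes build on. The control parameters NP (population size), F (differential weight) and CR (crossover probability) follow Storn and Price's naming; the defaults shown are common illustrative choices. The inner loop also shows the binomial ('bin') crossover referred to by DECrossover.

```csharp
using System;

// Illustrative stand-alone DE/rand/1/bin; not the SwarmOps implementation.
static class DeSketch
{
    static readonly Random Rng = new Random();

    public static double[] Optimize(
        Func<double[], double> fitness,  // function to minimize
        double[] lower, double[] upper,  // search-space bounds
        int np = 30, double f = 0.5, double cr = 0.9, int maxEvals = 10000)
    {
        int n = lower.Length;

        // Initialize the population uniformly at random within the bounds.
        var pop = new double[np][];
        var cost = new double[np];
        for (int i = 0; i < np; i++)
        {
            pop[i] = new double[n];
            for (int j = 0; j < n; j++)
                pop[i][j] = lower[j] + Rng.NextDouble() * (upper[j] - lower[j]);
            cost[i] = fitness(pop[i]);
        }

        for (int evals = np; evals < maxEvals; )
        {
            for (int i = 0; i < np && evals < maxEvals; i++, evals++)
            {
                // Pick three distinct agents a, b, c, all different from i.
                int a, b, c;
                do a = Rng.Next(np); while (a == i);
                do b = Rng.Next(np); while (b == i || b == a);
                do c = Rng.Next(np); while (c == i || c == a || c == b);

                // Rand/1 mutation combined with binomial ('bin') crossover.
                var trial = (double[])pop[i].Clone();
                int jRand = Rng.Next(n); // one dimension always crosses over
                for (int j = 0; j < n; j++)
                    if (j == jRand || Rng.NextDouble() < cr)
                        trial[j] = pop[a][j] + f * (pop[b][j] - pop[c][j]);

                // Greedy selection: replace agent i only on improvement.
                double trialCost = fitness(trial);
                if (trialCost <= cost[i]) { pop[i] = trial; cost[i] = trialCost; }
            }
        }

        // Return the best agent found.
        int best = 0;
        for (int i = 1; i < np; i++) if (cost[i] < cost[best]) best = i;
        return pop[best];
    }
}
```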
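
A minimal sketch of the GD idea, assuming no analytic gradient is available and estimating it by forward differences instead; the step size alpha and difference step h are illustrative choices.

```csharp
using System;

// Illustrative gradient descent with a finite-difference gradient.
static class GdSketch
{
    public static double[] Optimize(
        Func<double[], double> fitness, double[] x0,
        double alpha = 0.01, double h = 1e-6, int iterations = 1000)
    {
        var x = (double[])x0.Clone();
        for (int it = 0; it < iterations; it++)
        {
            // Estimate the gradient by forward differences.
            double fx = fitness(x);
            var grad = new double[x.Length];
            for (int j = 0; j < x.Length; j++)
            {
                double old = x[j];
                x[j] = old + h;
                grad[j] = (fitness(x) - fx) / h;
                x[j] = old;
            }

            // Take a small step against the gradient (downhill).
            for (int j = 0; j < x.Length; j++)
                x[j] -= alpha * grad[j];
        }
        return x;
    }
}
```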
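
A sketch of the self-adaptation step that distinguishes jDE from plain DE, per Brest et al.: each agent carries its own F and CR, which are re-randomized with small probabilities tau1 and tau2 before a trial is produced, and the new values survive only if the trial does. The constants Fl, Fu, tau1 and tau2 are the commonly cited jDE defaults; the surrounding loop is assumed to be the rand/1/bin scheme sketched above.

```csharp
using System;

// Illustrative jDE step; indices a, b, c are assumed distinct and != i,
// chosen by the enclosing DE loop (not shown here).
static class JdeSketch
{
    static readonly Random Rng = new Random();

    public static void AdaptAndTry(
        double[][] pop, double[] cost, double[] fArr, double[] crArr,
        int i, int a, int b, int c, Func<double[], double> fitness)
    {
        const double Fl = 0.1, Fu = 0.9, Tau1 = 0.1, Tau2 = 0.1;

        // Self-adapt this agent's control parameters.
        double f = Rng.NextDouble() < Tau1 ? Fl + Rng.NextDouble() * Fu : fArr[i];
        double cr = Rng.NextDouble() < Tau2 ? Rng.NextDouble() : crArr[i];

        // Standard rand/1/bin trial using the (possibly new) f and cr.
        int n = pop[i].Length, jRand = Rng.Next(n);
        var trial = (double[])pop[i].Clone();
        for (int j = 0; j < n; j++)
            if (j == jRand || Rng.NextDouble() < cr)
                trial[j] = pop[a][j] + f * (pop[b][j] - pop[c][j]);

        // Greedy selection; the adapted f and cr survive with the trial.
        double trialCost = fitness(trial);
        if (trialCost <= cost[i])
        {
            pop[i] = trial; cost[i] = trialCost;
            fArr[i] = f; crArr[i] = cr;
        }
    }
}
```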
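
A sketch of LUS: sample uniformly within a range around the current best position and shrink the range exponentially on failure. One common formulation of the decrease factor is q = 2^(-1/(gamma*n)) with gamma = 3; treat that, like the bound clamping, as an assumption of this sketch rather than the exact SwarmOps implementation.

```csharp
using System;

// Illustrative Local Unimodal Sampling.
static class LusSketch
{
    static readonly Random Rng = new Random();

    public static double[] Optimize(
        Func<double[], double> fitness,
        double[] lower, double[] upper, int maxEvals = 5000)
    {
        int n = lower.Length;
        double gamma = 3.0;
        double q = Math.Pow(2.0, -1.0 / (gamma * n)); // range-decrease factor

        // Start at a random position; the sampling range spans the whole space.
        var x = new double[n];
        var d = new double[n];
        for (int j = 0; j < n; j++)
        {
            x[j] = lower[j] + Rng.NextDouble() * (upper[j] - lower[j]);
            d[j] = upper[j] - lower[j];
        }
        double fx = fitness(x);

        for (int evals = 1; evals < maxEvals; evals++)
        {
            // Sample a candidate uniformly within the current range around x.
            var y = new double[n];
            for (int j = 0; j < n; j++)
            {
                y[j] = x[j] + (2.0 * Rng.NextDouble() - 1.0) * d[j];
                y[j] = Math.Min(Math.Max(y[j], lower[j]), upper[j]); // keep in bounds
            }

            double fy = fitness(y);
            if (fy < fx) { x = y; fx = fy; }            // move on improvement
            else for (int j = 0; j < n; j++) d[j] *= q; // otherwise shrink range
        }
        return x;
    }
}
```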
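
A sketch of the MESH idea: evaluate the fitness at every point of a regular grid over the search space, e.g. to plot a performance landscape. The odometer-style index advance is just one simple way to enumerate all combinations; pointsPerDim must be at least 2.

```csharp
using System;

// Illustrative mesh/grid enumeration.
static class MeshSketch
{
    public static void Enumerate(
        Func<double[], double> fitness,
        double[] lower, double[] upper, int pointsPerDim,
        Action<double[], double> report)
    {
        int n = lower.Length;
        var index = new int[n]; // current grid index in each dimension
        var x = new double[n];

        while (true)
        {
            // Map the grid index to a point in the search space.
            for (int j = 0; j < n; j++)
                x[j] = lower[j] + index[j] * (upper[j] - lower[j]) / (pointsPerDim - 1);
            report((double[])x.Clone(), fitness(x));

            // Advance the index like an odometer; stop after the last point.
            int k = 0;
            while (k < n && ++index[k] == pointsPerDim) index[k++] = 0;
            if (k == n) break;
        }
    }
}
```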
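
A sketch of MOL: like basic PSO but with no attraction to a particle's own best position, and with a randomly chosen particle updated at each step instead of a sweep over the whole swarm. The weights w and c are illustrative, not tuned values.

```csharp
using System;

// Illustrative Many Optimizing Liaisons.
static class MolSketch
{
    static readonly Random Rng = new Random();

    public static double[] Optimize(
        Func<double[], double> fitness,
        double[] lower, double[] upper,
        int swarmSize = 50, double w = 0.7, double c = 1.5, int maxEvals = 10000)
    {
        int n = lower.Length;
        var x = new double[swarmSize][];
        var v = new double[swarmSize][];
        double[] g = null;              // swarm's best-known position
        double gCost = double.MaxValue;

        // Random initial positions and velocities.
        for (int i = 0; i < swarmSize; i++)
        {
            x[i] = new double[n];
            v[i] = new double[n];
            for (int j = 0; j < n; j++)
            {
                double range = upper[j] - lower[j];
                x[i][j] = lower[j] + Rng.NextDouble() * range;
                v[i][j] = (2.0 * Rng.NextDouble() - 1.0) * range;
            }
            double cost0 = fitness(x[i]);
            if (cost0 < gCost) { gCost = cost0; g = (double[])x[i].Clone(); }
        }

        for (int evals = swarmSize; evals < maxEvals; evals++)
        {
            int i = Rng.Next(swarmSize); // random particle, not a full sweep
            double r = Rng.NextDouble();
            for (int j = 0; j < n; j++)
            {
                // Velocity: inertia plus attraction to the swarm best only.
                v[i][j] = w * v[i][j] + c * r * (g[j] - x[i][j]);
                x[i][j] = Math.Min(Math.Max(x[i][j] + v[i][j], lower[j]), upper[j]);
            }
            double cost = fitness(x[i]);
            if (cost < gCost) { gCost = cost; g = (double[])x[i].Clone(); }
        }
        return g;
    }
}
```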
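
A sketch of the meta-fitness idea: the fitness of a set of control parameters is the summed result of several optimization runs on several problems, so that lower meta-fitness means better control parameters. The delegate types here are hypothetical stand-ins, not the SwarmOps MetaFitness API.

```csharp
using System;

// Illustrative meta-fitness computation.
static class MetaFitnessSketch
{
    // optimizerRun: given a problem and control parameters, performs one
    // optimization run and returns the best fitness it found.
    public static double Compute(
        Func<Func<double[], double>, double[], double> optimizerRun,
        Func<double[], double>[] problems,
        double[] controlParameters, int runs)
    {
        double sum = 0.0;
        foreach (var problem in problems)
            for (int r = 0; r < runs; r++)
                sum += optimizerRun(problem, controlParameters);
        return sum;
    }
}
```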
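
A sketch of one compass-search-like Pattern Search: perturb a single randomly chosen dimension at a time; keep the move on improvement, otherwise undo it and halve and reverse the step for that dimension. This matches the general description above but is not necessarily Pedersen's exact variant.

```csharp
using System;

// Illustrative Pattern Search / compass search.
static class PsSketch
{
    static readonly Random Rng = new Random();

    public static double[] Optimize(
        Func<double[], double> fitness,
        double[] lower, double[] upper, int maxEvals = 5000)
    {
        int n = lower.Length;
        var x = new double[n];
        var d = new double[n]; // per-dimension step size and direction
        for (int j = 0; j < n; j++)
        {
            x[j] = lower[j] + Rng.NextDouble() * (upper[j] - lower[j]);
            d[j] = (upper[j] - lower[j]) / 2.0;
        }
        double fx = fitness(x);

        for (int evals = 1; evals < maxEvals; evals++)
        {
            int j = Rng.Next(n); // pick one dimension to perturb
            double old = x[j];
            x[j] = Math.Min(Math.Max(old + d[j], lower[j]), upper[j]);

            double fy = fitness(x);
            if (fy < fx) fx = fy;              // keep the move
            else { x[j] = old; d[j] *= -0.5; } // undo; halve and reverse step
        }
        return x;
    }
}
```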
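
A sketch of 'plain vanilla' global-best PSO: each particle is attracted both to its own best-known position and to the swarm's best. The weights w, c1 and c2 are common illustrative values, not meta-optimized ones.

```csharp
using System;

// Illustrative Particle Swarm Optimization.
static class PsoSketch
{
    static readonly Random Rng = new Random();

    public static double[] Optimize(
        Func<double[], double> fitness,
        double[] lower, double[] upper, int swarmSize = 50,
        double w = 0.7, double c1 = 1.5, double c2 = 1.5, int maxIter = 200)
    {
        int n = lower.Length;
        var x = new double[swarmSize][]; var v = new double[swarmSize][];
        var p = new double[swarmSize][]; var pCost = new double[swarmSize];
        double[] g = null; double gCost = double.MaxValue;

        // Random initial positions and velocities; record personal/swarm bests.
        for (int i = 0; i < swarmSize; i++)
        {
            x[i] = new double[n]; v[i] = new double[n];
            for (int j = 0; j < n; j++)
            {
                double range = upper[j] - lower[j];
                x[i][j] = lower[j] + Rng.NextDouble() * range;
                v[i][j] = (2.0 * Rng.NextDouble() - 1.0) * range;
            }
            p[i] = (double[])x[i].Clone(); pCost[i] = fitness(x[i]);
            if (pCost[i] < gCost) { gCost = pCost[i]; g = (double[])p[i].Clone(); }
        }

        for (int it = 0; it < maxIter; it++)
            for (int i = 0; i < swarmSize; i++)
            {
                for (int j = 0; j < n; j++)
                {
                    // Inertia + attraction to personal best + attraction to swarm best.
                    v[i][j] = w * v[i][j]
                            + c1 * Rng.NextDouble() * (p[i][j] - x[i][j])
                            + c2 * Rng.NextDouble() * (g[j] - x[i][j]);
                    x[i][j] = Math.Min(Math.Max(x[i][j] + v[i][j], lower[j]), upper[j]);
                }
                double cost = fitness(x[i]);
                if (cost < pCost[i]) { pCost[i] = cost; p[i] = (double[])x[i].Clone(); }
                if (cost < gCost) { gCost = cost; g = (double[])x[i].Clone(); }
            }
        return g;
    }
}
```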
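
Finally, the trivial RND baseline: evaluate uniformly random points and keep the best, as a worst-case reference for the optimizers above.

```csharp
using System;

// Illustrative pure random sampling.
static class RndSketch
{
    static readonly Random Rng = new Random();

    public static double[] Optimize(
        Func<double[], double> fitness,
        double[] lower, double[] upper, int maxEvals = 5000)
    {
        double[] best = null; double bestCost = double.MaxValue;
        for (int e = 0; e < maxEvals; e++)
        {
            // Draw a uniformly random point within the bounds.
            var x = new double[lower.Length];
            for (int j = 0; j < x.Length; j++)
                x[j] = lower[j] + Rng.NextDouble() * (upper[j] - lower[j]);

            double cost = fitness(x);
            if (cost < bestCost) { bestCost = cost; best = x; }
        }
        return best;
    }
}
```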