evolution 0.1.6
Simulated evolution: an evolutionary algorithm library for Dart. It can be used to optimize the next trials in adaptive experimentation and can become part of Flutter or web projects.
Evolution #
An easy-to-use optimization library based on evolutionary algorithms.
It can be used in both Dart and Flutter projects.
Getting started #
Add the dependency to your pubspec.yaml file:
dependencies:
  evolution: ^0.1.6 # latest version
Add this import statement to your source files:
import 'package:evolution/evolution.dart';
To give it a try, simply run the example.
dart ./example/main.dart
The example optimizes four benchmark problems:
- Sphere100 minimizes the sphere function with its global minimum at (100.0, ..., 100.0).
- Sphere100 (restricted) is the same problem with a restricted search space.
- Ackley100 minimizes the Ackley function with its global minimum at (100.0, ..., 100.0).
- Ackley100 (restricted) is the same problem with a restricted search space.
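For reference, the two benchmark fitness functions can be written down directly. The sketch below is a plain reimplementation with illustrative names (`sphere100`, `ackley100`), not the package's own code:

```dart
import 'dart:math';

/// Sphere function with the global minimum shifted to (100, ..., 100).
double sphere100(List<double> x) =>
    x.fold<double>(0.0, (s, xi) => s + (xi - 100.0) * (xi - 100.0));

/// Ackley function with the global minimum shifted to (100, ..., 100).
double ackley100(List<double> x) {
  final n = x.length;
  final sumSq =
      x.fold<double>(0.0, (s, xi) => s + (xi - 100.0) * (xi - 100.0));
  final sumCos = x.fold<double>(0.0, (s, xi) => s + cos(2 * pi * (xi - 100.0)));
  return -20.0 * exp(-0.2 * sqrt(sumSq / n)) - exp(sumCos / n) + 20.0 + e;
}

void main() {
  final optimum = List<double>.filled(5, 100.0);
  print(sphere100(optimum)); // 0.0
  print(ackley100(optimum) < 1e-9); // true
}
```

The sphere function is the standard smoke test for optimizers; the Ackley function adds many local minima, which makes it a harder target for population-based search.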
Differential Evolution #
Differential Evolution is defined as:
Agent diff(
  int positions, // number of dimensions per candidate solution
  int sizeN, // population size
  int bestN, // number of best candidates kept each generation
  int randN, // number of random survivors (unused in this variant)
  int diffN, // number of agents used for differential recombination
  int seed, // seed for the random number generator
  int steps, // number of generations
  double w, // initial mutation weight, decays with the generation count
  double Function(List<double>) fitness, // fitness function to minimize
) {
  Random r = Random(seed);
  int z = 0;
  Population p0 = gp(sizeN, positions, r, fitness);
  if (p0.isEmpty) throw Exception("initial population is empty, sizeN $sizeN");
  while (z < steps) {
    double wz = w / ((z == 0 ? 1 : z)).toDouble();
    // best survivors
    Population best = p0.copy().sorted().select(bestN);
    // mutation
    Population mutated = p0.copy().mutation(wz / 10.0);
    Population differential = mutated.differential(diffN, wz * 10.0);
    // combine candidate solutions and keep the fittest sizeN of them
    Population all = Population(best + differential, r, fitness);
    p0 = all.copy().sorted().select(sizeN);
    z++;
  }
  return p0.sorted().select(1).first;
}
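The function above relies on the library's `Population` and `Agent` types. To make the idea concrete without them, here is a self-contained sketch of classic differential evolution (DE/rand/1/bin) in plain Dart; the function name and parameter values are illustrative and this is not the library's implementation:

```dart
import 'dart:math';

/// Minimal DE/rand/1/bin sketch: returns the best candidate found.
List<double> differentialEvolution(
  double Function(List<double>) fitness,
  int dims,
  int popSize,
  int steps, {
  double f = 0.8, // differential weight
  double cr = 0.9, // crossover probability
  int seed = 42,
}) {
  final r = Random(seed);
  // initialize the population uniformly in [0, 200)
  final pop = List.generate(
      popSize, (_) => List.generate(dims, (_) => r.nextDouble() * 200.0));
  final scores = pop.map(fitness).toList();
  for (var step = 0; step < steps; step++) {
    for (var i = 0; i < popSize; i++) {
      // pick three distinct agents a, b, c, all different from i
      int a, b, c;
      do { a = r.nextInt(popSize); } while (a == i);
      do { b = r.nextInt(popSize); } while (b == i || b == a);
      do { c = r.nextInt(popSize); } while (c == i || c == a || c == b);
      // build the trial vector by binomial crossover with a + f * (b - c)
      final jRand = r.nextInt(dims);
      final trial = List<double>.generate(dims, (j) {
        if (j == jRand || r.nextDouble() < cr) {
          return pop[a][j] + f * (pop[b][j] - pop[c][j]);
        }
        return pop[i][j];
      });
      // greedy selection: the trial replaces the agent if it is fitter
      final trialScore = fitness(trial);
      if (trialScore <= scores[i]) {
        pop[i] = trial;
        scores[i] = trialScore;
      }
    }
  }
  var bestI = 0;
  for (var i = 1; i < popSize; i++) {
    if (scores[i] < scores[bestI]) bestI = i;
  }
  return pop[bestI];
}

void main() {
  double sphere100(List<double> x) =>
      x.fold<double>(0.0, (s, xi) => s + (xi - 100.0) * (xi - 100.0));
  final best = differentialEvolution(sphere100, 5, 40, 300);
  print(sphere100(best)); // a small value close to 0
}
```

The library's `diff` differs in its survivor scheme (it keeps the `bestN` candidates plus the differential offspring and truncates back to `sizeN`), but the core idea — perturbing agents with scaled difference vectors and keeping improvements — is the same.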
To improve performance, use a restricted search space and the corresponding version of the algorithm:
/// Version of [diff] for optimizing with a restricted search space.
Agent diff2(
  int positions, // number of dimensions per candidate solution
  int sizeN, // population size
  int bestN, // number of best candidates kept each generation
  int randN, // number of random survivors (unused in this variant)
  int diffN, // number of agents used for differential recombination
  int seed, // seed for the random number generator
  int steps, // number of generations
  double w, // initial mutation weight, decays with the generation count
  double Function(List<double>) fitness, // fitness function to minimize
  double lower, // lower bound of the search space
  double upper, // upper bound of the search space
) {
  Random r = Random(seed);
  int z = 0;
  Population p0 = gp(sizeN, positions, r, fitness);
  if (p0.isEmpty) throw Exception("initial population is empty, sizeN $sizeN");
  while (z < steps) {
    double wz = w / ((z == 0 ? 1 : z)).toDouble();
    // best survivors
    Population best = p0.copy().sorted().select(bestN);
    // mutation, confined to [lower, upper]
    Population mutated = p0.copy().mutation(wz / 10.0).confined(lower, upper);
    Population differential = mutated.differential(diffN, wz * 10.0);
    // combine candidate solutions and keep the fittest sizeN of them
    Population all = Population(best + differential, r, fitness);
    p0 = all.copy().sorted().select(sizeN);
    z++;
  }
  return p0.sorted().select(1).first;
}
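The `confined` call is what keeps mutated candidates inside the search space. Assuming it clamps each coordinate to the bounds (an assumption about the library's behavior, not confirmed here), the operation could look like this on a plain list:

```dart
/// Sketch of coordinate-wise confinement: each value is clamped to
/// [lower, upper]. Illustrative only; not the library's implementation.
List<double> confined(List<double> x, double lower, double upper) =>
    x.map((xi) => xi.clamp(lower, upper).toDouble()).toList();

void main() {
  print(confined([-5.0, 50.0, 250.0], 0.0, 200.0)); // [0.0, 50.0, 200.0]
}
```

Clamping is the simplest bound-handling strategy; alternatives such as reflecting or re-sampling out-of-bounds coordinates are also common in the DE literature.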
Evolutionary Algorithms #
Evolutionary algorithms (EAs) optimize a problem iteratively by stochastically improving candidate solutions with regard to a given measure of quality. An EA maintains a population of candidate solutions and creates new candidate solutions by mutation and recombination of existing ones, keeping whichever candidate solutions have the best score, or fitness, on the optimization problem. EAs make few assumptions about the problem being optimized beyond, at most, restricting the search space of candidate solutions. The optimization problem is thus treated as a black box that merely assigns each candidate solution a measure of quality, its fitness; the gradient of the problem function is therefore not needed.
EAs can be used on multidimensional real-valued functions but do not use the gradient of the problem being optimized, which means they do not require the optimization problem to be differentiable, as is required by classic optimization methods such as gradient descent and quasi-Newton methods. They can therefore also be used on optimization problems that are not even continuous, are noisy, or change over time. EAs do not, however, guarantee that an optimal solution is ever found.
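As a concrete illustration of the mutate-evaluate-select loop on a non-differentiable problem, here is a minimal (1+1) EA sketch in plain Dart; all names and parameters are illustrative, not part of the library:

```dart
import 'dart:math';

// |x - 3| + |y + 1| is continuous but not differentiable at its minimum,
// so gradient descent cannot be applied directly; an EA needs only the
// fitness value itself.
double fitness(List<double> x) => (x[0] - 3.0).abs() + (x[1] + 1.0).abs();

/// Runs a (1+1) EA for [steps] generations and returns the best score found.
double onePlusOneEA(int steps, int seed) {
  final r = Random(seed);
  var parent = [0.0, 0.0];
  var parentScore = fitness(parent);
  var sigma = 1.0; // mutation step size
  for (var step = 0; step < steps; step++) {
    // mutate: perturb each coordinate with uniform noise in [-sigma, sigma]
    final child = parent
        .map((xi) => xi + sigma * (r.nextDouble() * 2.0 - 1.0))
        .toList();
    final childScore = fitness(child);
    // select: keep whichever candidate has the better (lower) score
    if (childScore <= parentScore) {
      parent = child;
      parentScore = childScore;
    }
    sigma *= 0.997; // slowly shrink the step size
  }
  return parentScore;
}

void main() {
  print(onePlusOneEA(2000, 7)); // a small value close to 0
}
```

This is the smallest possible EA (population of one, mutation only); the differential evolution functions above add a real population, recombination via difference vectors, and truncation selection on top of the same loop.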