Find global minima for highly nonlinear problems
A genetic algorithm (GA) is a method for solving both constrained and unconstrained optimization problems based on a natural selection process that mimics biological evolution. The algorithm repeatedly modifies a population of individual solutions. At each step, the genetic algorithm randomly selects individuals from the current population and uses them as parents to produce the children for the next generation. Over successive generations, the population "evolves" toward an optimal solution.
You can apply the genetic algorithm to solve problems that are not well suited for standard optimization algorithms, including problems in which the objective function is discontinuous, nondifferentiable, stochastic, or highly nonlinear.
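The selection-crossover-mutation loop described above can be sketched in a few dozen lines. The following is a minimal, illustrative implementation (not the Global Optimization Toolbox's `ga` function): it minimizes a hypothetical objective with both a kink and a jump discontinuity, exactly the kind of function for which gradient-based methods are a poor fit. All function names and parameter values are assumptions chosen for the sketch.

```python
import random

def objective(x):
    # Nondifferentiable (kink at x = 3) and discontinuous (jump at x = 5).
    return abs(x - 3.0) + (1.0 if x > 5.0 else 0.0)

def evolve(pop_size=50, generations=100, bounds=(-10.0, 10.0),
           mutation_rate=0.2, seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    # Initial population: random points in the search interval.
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        def pick_parent():
            # Binary tournament selection: the better of two random draws.
            a, b = rng.choice(pop), rng.choice(pop)
            return a if objective(a) < objective(b) else b
        children = []
        for _ in range(pop_size):
            p1, p2 = pick_parent(), pick_parent()
            # Crossover: random blend of the two parents.
            w = rng.random()
            child = w * p1 + (1.0 - w) * p2
            # Mutation: occasional Gaussian perturbation.
            if rng.random() < mutation_rate:
                child += rng.gauss(0.0, 0.5)
            children.append(min(hi, max(lo, child)))
        pop = children
    return min(pop, key=objective)

best = evolve()
print(best)  # converges near the global minimum at x = 3
```

Over successive generations the population drifts toward x = 3 even though the objective has no useful derivative information there.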
The genetic algorithm differs from a classical, derivative-based optimization algorithm in two main ways, as summarized in the following table.
| Classical Algorithm | Genetic Algorithm |
|---|---|
| Generates a single point at each iteration. The sequence of points approaches an optimal solution. | Generates a population of points at each iteration. The best point in the population approaches an optimal solution. |
| Selects the next point in the sequence by a deterministic computation. | Selects the next population by computations that use random number generators. |
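The stochastic selection in the right-hand column can be sketched as a roulette-wheel draw: fitter individuals are proportionally more likely to be chosen as parents, but any individual may be picked, so no two runs need follow the same sequence of points. This is a generic illustration of fitness-proportionate selection, not the Toolbox's exact selection rule.

```python
import random

def roulette_select(population, fitness, rng):
    # Draw one individual with probability proportional to its fitness
    # (higher fitness = better here). Unlike a classical deterministic
    # update, the outcome varies from call to call.
    total = sum(fitness)
    r = rng.uniform(0.0, total)
    acc = 0.0
    for individual, f in zip(population, fitness):
        acc += f
        if acc >= r:
            return individual
    return population[-1]

rng = random.Random(42)
pop = ["a", "b", "c"]
fit = [1.0, 3.0, 6.0]
draws = [roulette_select(pop, fit, rng) for _ in range(1000)]
# "c" (fitness 6) is drawn far more often than "a" (fitness 1).
print(draws.count("c"), draws.count("a"))
```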
For more information about applying genetic algorithms, see Global Optimization Toolbox.
See also: Global Optimization Toolbox, Optimization Toolbox, simulated annealing, linear programming, quadratic programming, integer programming, nonlinear programming, multiobjective optimization, genetic algorithm videos, reinforcement learning, surrogate optimization, design optimization