## Global Optimization Toolbox

**Category** Intelligent Software > Genetic Algorithm Systems/Tools

**Abstract** The Global Optimization Toolbox provides methods that search for global solutions to problems that contain multiple maxima or minima. It includes global search, multi-start, pattern search, genetic algorithm, and simulated annealing solvers.

You can use these solvers to solve optimization problems in which the objective or constraint function is continuous, discontinuous, or stochastic, does not possess derivatives, or includes simulations or black-box functions with undefined values for some parameter settings.

Genetic algorithm and pattern search solvers support algorithmic customization. You can create a custom genetic algorithm variant by modifying initial population and fitness scaling options or by defining parent selection, crossover, and mutation functions.

You can also customize pattern search by defining polling, searching, and other functions.

Global Optimization Toolbox Key Features:

1) Interactive tools for defining and solving optimization problems and monitoring solution progress;

2) Global search and multi-start solvers for finding single or multiple global optima;

3) Genetic algorithm solver that supports linear, nonlinear, and bound constraints;

4) Multi-objective genetic algorithm with Pareto-front identification, including linear and bound constraints;

5) Pattern search solver that supports linear, nonlinear, and bound constraints;

6) Simulated annealing tools that implement a random search method, with options for defining the annealing process, temperature schedule, and acceptance criteria;

7) Parallel computing support in multi-start, genetic algorithm, and pattern search solvers; and

8) Custom data type support in genetic algorithm, multi-objective genetic algorithm, and simulated annealing solvers.

Defining, Solving, and Assessing Optimization Problems --

The Global Optimization Toolbox provides functions that you can access from the command line and from the Optimization Tool graphical user interface (GUI) in the Optimization Toolbox™ (see Note 2). Both the command line and GUI let you:

1) Select a solver and define an optimization problem;

2) Set and inspect optimization options;

3) Run optimization problems and visualize intermediate and final results;

4) Use Optimization Toolbox solvers to refine genetic algorithm, simulated annealing, and pattern search results;

5) Import and export optimization problems and results to and from your MATLAB® workspace (see Note 1); and

6) Capture and reuse work performed in the GUI using MATLAB code generation.

You can also customize the solvers by providing your own algorithm options and custom functions. Multi-start and global search solvers are accessible only from the command line.

Global Search and Multi-start Solvers --

The global search and multi-start solvers use gradient-based methods to return local and global minima. Both solvers start a local solver (in the Optimization Toolbox) from multiple starting points and store local and global solutions found during the search process.

The global search solver:

1) Uses a scatter-search algorithm to generate multiple starting points;

2) Filters non-promising start points based upon objective and constraint function values and local minima already found; and

3) Runs a constrained nonlinear optimization solver to search for a local minimum from the remaining start points.

The multi-start solver uses either uniformly distributed start points within predefined bounds or user-defined start points to find multiple local minima, including a single global minimum if one exists. The multi-start solver runs the local solver from all starting points and can be run in serial or in parallel (using the Parallel Computing Toolbox™).

The multi-start solver also provides flexibility in choosing different local nonlinear solvers. The available local solvers include unconstrained nonlinear, constrained nonlinear, nonlinear least-squares, and nonlinear least-squares curve fitting.
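The multi-start strategy described above can be sketched in plain Python. This is an illustrative sketch only, not the toolbox's MultiStart solver: the multi-modal test function, the crude fixed-step gradient-descent local solver, and all tolerances are assumptions chosen for the example.

```python
import math
import random

def f(x):
    # Multi-modal 1-D test function (illustrative choice, not from the toolbox)
    return math.sin(3 * x) + 0.1 * x * x

def local_minimize(f, x0, step=1e-3, tol=1e-8, max_iter=10000):
    """Crude gradient descent with a numeric derivative,
    standing in for a real local solver such as fmincon."""
    x = x0
    for _ in range(max_iter):
        g = (f(x + 1e-6) - f(x - 1e-6)) / 2e-6   # central-difference gradient
        x_new = x - step * g
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x

def multi_start(f, lb, ub, n_starts=20, seed=0):
    """Run the local solver from uniformly distributed start points
    within the bounds and keep the distinct local minima found."""
    rng = random.Random(seed)
    minima = []
    for _ in range(n_starts):
        x0 = rng.uniform(lb, ub)
        x = local_minimize(f, x0)
        # Keep only in-bound minima not already recorded
        if lb <= x <= ub and all(abs(x - m) > 1e-2 for m in minima):
            minima.append(x)
    return sorted(minima)

minima = multi_start(f, -3.0, 3.0)
best = min(minima, key=f)
```

Each start point simply seeds an independent local search; the global minimum is then the best of the local solutions, which is exactly why the quality of coverage of the start points matters.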

Genetic Algorithm Solver --

The genetic algorithm solves optimization problems by mimicking the principles of biological evolution, repeatedly modifying a population of individual points using rules modeled on gene combinations in biological reproduction. Due to its random nature, the genetic algorithm improves your chances of finding a global solution.

It enables you to solve unconstrained, bound-constrained, and general optimization problems, and it does not require the functions to be differentiable or continuous.

The following table shows the standard genetic algorithm options provided by Global Optimization Toolbox.

Step | Genetic Algorithm Option
---|---
Creation | Uniform, feasible
Fitness scaling | Rank-based, proportional, top (truncation), shift linear
Selection | Roulette, stochastic uniform selection (SUS), tournament, uniform, remainder
Crossover | Arithmetic, heuristic, intermediate, scattered, single-point, two-point
Mutation | Adaptive feasible, Gaussian, uniform
Plotting | Best fitness, best individual, distance among individuals, diversity of population, expectation of individuals, max constraint, range, selection index, stopping conditions

Global Optimization Toolbox also lets you specify:

- a) Population size;
- b) Number of elite children;
- c) Crossover fraction;
- d) Migration among subpopulations (using ring topology); and
- e) Bound, linear, and nonlinear constraints for an optimization problem.

You can customize these algorithm options by providing user-defined functions and represent the problem in a variety of data formats, for example by defining variables that are integers, mixed integers, categorical, or complex.

You can base the stopping criteria for the algorithm on time, stalling, fitness limit, or number of generations. And you can vectorize your fitness function to improve execution speed or execute the objective and constraint functions in parallel (using the Parallel Computing Toolbox).
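The generational loop described above (elite children, a crossover fraction, tournament selection, Gaussian mutation) can be illustrated with a minimal real-coded genetic algorithm in Python. This is a sketch of the general technique, not the toolbox's `ga` solver; the parameter values and the quadratic test problem are assumptions for the example.

```python
import random

def ga_minimize(fitness, lb, ub, pop_size=40, n_elite=2,
                crossover_fraction=0.8, generations=100, seed=0):
    """Minimal real-coded genetic algorithm: elitism, tournament
    selection, intermediate crossover, and Gaussian mutation."""
    rng = random.Random(seed)
    clip = lambda x: [min(max(v, lo), hi) for v, lo, hi in zip(x, lb, ub)]
    pop = [[rng.uniform(lo, hi) for lo, hi in zip(lb, ub)]
           for _ in range(pop_size)]

    def tournament(k=2):
        # Pick the fitter of k randomly sampled individuals
        return min(rng.sample(pop, k), key=fitness)

    for _ in range(generations):
        pop.sort(key=fitness)
        children = pop[:n_elite]                   # elite children pass through
        while len(children) < pop_size:
            if rng.random() < crossover_fraction:  # intermediate crossover
                p1, p2 = tournament(), tournament()
                w = rng.random()
                child = [w * a + (1 - w) * b for a, b in zip(p1, p2)]
            else:                                  # Gaussian mutation of a parent
                p = tournament()
                child = [v + rng.gauss(0, 0.1 * (hi - lo))
                         for v, lo, hi in zip(p, lb, ub)]
            children.append(clip(child))
        pop = children
    return min(pop, key=fitness)

best = ga_minimize(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2,
                   lb=[-5, -5], ub=[5, 5])
```

Because the elite children are copied into every new generation, the best fitness in the population can never get worse, which is the usual rationale for specifying a nonzero elite count.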

Multiobjective Genetic Algorithm Solver --

Multiobjective optimization is concerned with the minimization of multiple objective functions that are subject to a set of constraints. The multiobjective genetic algorithm solver is used to solve multiobjective optimization problems by identifying the Pareto front - the set of evenly distributed non-dominated optimal solutions.

You can use this solver to solve smooth or non-smooth optimization problems with or without bound and linear constraints. The multiobjective genetic algorithm does not require the functions to be differentiable or continuous.

The following table shows the standard multiobjective genetic algorithm options provided by Global Optimization Toolbox.

Step | Multiobjective Genetic Algorithm Option
---|---
Creation | Uniform, feasible
Fitness scaling | Rank-based, proportional, top (truncation), linear scaling, shift
Selection | Tournament
Crossover | Arithmetic, heuristic, intermediate, scattered, single-point, two-point
Mutation | Adaptive feasible, Gaussian, uniform
Plotting | Average Pareto distance, average Pareto spread, distance among individuals, diversity of population, expectation of individuals, Pareto front, rank histogram, selection index, stopping conditions

Global Optimization Toolbox also lets you specify:

- a) Population size;
- b) Crossover fraction;
- c) Pareto fraction;
- d) Distance measure across individuals;
- e) Migration among subpopulations (using ring topology); and
- f) Linear and bound constraints for an optimization problem.

You can customize these algorithm options by providing user-defined functions and represent the problem in a variety of data formats, for example by defining variables that are integers, mixed integers, categorical, or complex.

You can base the stopping criteria for the algorithm on time, fitness limit, or number of generations. And you can vectorize your fitness function to improve execution speed or execute the objective functions in parallel (using the Parallel Computing Toolbox).
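The defining concept here is Pareto dominance: a solution is on the Pareto front if no other solution is at least as good in every objective and strictly better in at least one. A minimal sketch of that test (assuming minimization and plain tuples of objective values, not the toolbox's data structures):

```python
def dominates(f1, f2):
    """True if objective vector f1 dominates f2: no worse in every
    objective and strictly better in at least one (minimization)."""
    return (all(a <= b for a, b in zip(f1, f2)) and
            any(a < b for a, b in zip(f1, f2)))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

front = pareto_front([(1, 5), (2, 2), (3, 4), (4, 1), (5, 5)])
# (3, 4) is dominated by (2, 2); (5, 5) is dominated by every other point
```

A multiobjective genetic algorithm ranks its population with a test like this each generation, so the surviving non-dominated set gradually approximates the true Pareto front.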

Pattern Search Solver --

The Global Optimization Toolbox contains three direct search algorithms: generalized pattern search (GPS), generating set search (GSS), and mesh adaptive direct search (MADS).

While more traditional optimization algorithms use exact or approximate information about the gradient or higher derivatives to search for an optimal point, these algorithms use a pattern search method that implements a minimal and maximal positive basis pattern.

The pattern search method handles optimization problems with nonlinear, linear, and bound constraints, and does not require functions to be differentiable or continuous.

The following table shows the pattern search algorithm options provided by Global Optimization Toolbox. You can change any of the options from the command line or the Optimization Tool.

Pattern Search Option | Description
---|---
Polling methods | Decide how to generate and evaluate the points in a pattern and the maximum number of points generated at each step. You can also control the polling order of the points to improve efficiency.
Search methods | Choose an optional search step that may be more efficient than a poll step. You can perform a search in a pattern or in the entire search space. Global search methods, like the genetic algorithm, can be used to obtain a good starting point.
Mesh | Control how the pattern changes over iterations and adjust the mesh for problems that vary in scale across dimensions. You can choose the initial mesh size, mesh refining factor, or mesh contraction factor. The mesh accelerator speeds up convergence when it is near a minimum.
Cache | Store points evaluated during optimization of computationally expensive objective functions. You can specify the size and tolerance of the cache that the pattern search algorithm uses and vary the cache tolerance as the algorithm proceeds, improving optimization speed and efficiency.
Nonlinear constraint algorithm settings | Specify a penalty parameter for the nonlinear constraints as well as a penalty update factor.
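The poll-and-contract mechanics can be illustrated with a minimal GPS-style pattern search in Python: poll the coordinate directions x ± Δ·eᵢ, move to any improving point, and halve the mesh size Δ after an unsuccessful poll. This is a sketch of the basic method only; the non-smooth test function, the contraction factor of 0.5, and the omission of mesh expansion are assumptions for the example.

```python
def pattern_search(f, x0, delta=1.0, tol=1e-6, max_iter=1000):
    """Coordinate-direction (GPS-style) pattern search: poll x +/- delta*e_i,
    move to any improving point, otherwise contract the mesh."""
    x = list(x0)
    for _ in range(max_iter):
        if delta < tol:
            break
        improved = False
        for i in range(len(x)):
            for sign in (+1, -1):
                trial = list(x)
                trial[i] += sign * delta
                if f(trial) < f(x):   # successful poll: accept, keep mesh size
                    x = trial
                    improved = True
                    break
            if improved:
                break
        if not improved:
            delta *= 0.5              # unsuccessful poll: contract the mesh
    return x

# Non-smooth objective: |x1 + 1| has no derivative at the minimum,
# yet the poll steps still converge to it
xmin = pattern_search(lambda x: (x[0] - 2) ** 2 + abs(x[1] + 1), [0.0, 0.0])
```

Note that no gradient is ever computed: the algorithm only compares objective values at pattern points, which is why it tolerates discontinuous and non-differentiable functions.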

Simulated Annealing Solver --

Simulated annealing solves optimization problems using a probabilistic search algorithm that mimics the physical process of annealing, in which a material is heated and then the temperature is slowly lowered to decrease defects, thus minimizing the system energy. By analogy, each iteration of a simulated annealing algorithm seeks to improve the current minimum by slowly reducing the extent of the search.

The simulated annealing algorithm accepts all new points that lower the objective, but also, with a certain probability, points that raise the objective. By accepting points that raise the objective, the algorithm avoids being trapped in local minima in early iterations and is able to explore globally for better solutions.

Simulated annealing allows you to solve unconstrained or bound-constrained optimization problems and does not require that the functions be differentiable or continuous. From the command line or Optimization Tool you can use toolbox functions to:

- a) Solve problems using adaptive simulated annealing, Boltzmann annealing, or fast annealing algorithms;
- b) Create custom functions to define the annealing process, acceptance criteria, temperature schedule, plotting functions, simulation output, or custom data types; and
- c) Perform hybrid optimization by specifying another optimization method to run at defined intervals or at normal solver termination.
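The acceptance rule and temperature schedule described above can be sketched in a few lines of Python. This is an illustrative sketch of the generic method, not the toolbox solver: the geometric cooling rate, proposal step size, and multi-modal test function are all assumptions chosen for the example.

```python
import math
import random

def simulated_annealing(f, x0, lb, ub, t0=1.0, cooling=0.995,
                        n_iter=5000, seed=0):
    """Minimal 1-D simulated annealing: always accept improvements,
    accept uphill moves with probability exp(-delta/T), cool geometrically."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(n_iter):
        # Propose a Gaussian neighbour and clamp it to the bounds
        cand = min(max(x + rng.gauss(0, 0.5), lb), ub)
        fc = f(cand)
        delta = fc - fx
        # Metropolis acceptance: uphill moves survive while T is high
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = cand, fc
        if fx < fbest:
            best, fbest = x, fx
        t *= cooling   # temperature schedule: geometric cooling
    return best, fbest

# Multi-modal 1-D example (illustrative)
best, fbest = simulated_annealing(lambda x: math.sin(3 * x) + 0.1 * x * x,
                                  0.0, -3.0, 3.0)
```

Early in the run, when T is large, the exp(-delta/T) term accepts many uphill moves and the search roams between basins; as T decays the algorithm behaves increasingly like a pure descent, which is the "slowly reducing the extent of the search" behaviour described above.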

Solving Optimization Problems Using Parallel Computing --

You can use the Global Optimization Toolbox in conjunction with the Parallel Computing Toolbox to solve problems that benefit from parallel computation. By using built-in parallel computing capabilities or defining a custom parallel computing implementation of your optimization problem, you can decrease the time to solution.

Built-in support for parallel computing accelerates the objective and constraint function evaluation in genetic algorithm, multiobjective genetic algorithm, and pattern search solvers. You can accelerate the multi-start solver by distributing the multiple local solver calls across multiple MATLAB workers or by enabling the parallel gradient estimation in the local solvers.

Note 1: MATLAB is a high-level language and interactive environment that enables you to perform computationally intensive tasks faster than with traditional programming languages such as C, C++, and FORTRAN.

Note 2: The Optimization Toolbox extends the MATLAB technical computing environment with tools and widely used algorithms for standard and large-scale optimization.

*System Requirements*

Contact manufacturer.

*Manufacturer*

- The MathWorks, Inc.
- 3 Apple Hill Drive
- Natick, MA 01760-2098
- USA
- Phone: 508-647-7000
- Fax: 508-647-7001

**Manufacturer Web Site**
Global Optimization Toolbox

**Price** Contact manufacturer.

**G6G Abstract Number** 20030R

**G6G Manufacturer Number** 102625