Global Optimization Toolbox Solver Characteristics
Solver Choices
This section describes Global Optimization Toolbox solver characteristics and includes recommendations for obtaining results more effectively.
To achieve better or faster solutions, first try tuning the recommended solvers by setting appropriate options or bounds. If the results are unsatisfactory, try other solvers.
| Desired Solution | Smooth Objective and Constraints | Nonsmooth Objective or Constraints |
|---|---|---|
| Explanation of “Desired Solution” | Choosing Between Solvers for Smooth Problems | Choosing Between Solvers for Nonsmooth Problems |
| Single local solution | Optimization Toolbox™ functions; see Optimization Decision Table | fminbnd, patternsearch, fminsearch, ga, particleswarm, simulannealbnd, surrogateopt |
| Multiple local solutions | GlobalSearch, MultiStart | patternsearch, ga, particleswarm, simulannealbnd, or surrogateopt started from multiple initial points x0 or from multiple initial populations |
| Single global solution | GlobalSearch, MultiStart, patternsearch, particleswarm, ga, simulannealbnd, surrogateopt | patternsearch, ga, particleswarm, simulannealbnd, surrogateopt |
| Single local solution using parallel processing | MultiStart, Optimization Toolbox functions | patternsearch, ga, particleswarm, surrogateopt |
| Multiple local solutions using parallel processing | MultiStart | patternsearch, ga, or particleswarm started from multiple initial points x0 or from multiple initial populations |
| Single global solution using parallel processing | MultiStart | patternsearch, ga, particleswarm, surrogateopt |
Explanation of “Desired Solution”
To understand the meaning of the terms in “Desired Solution,” consider the example
f(x) = 100x^2(1 - x)^2 - x,
which has local minima x1 near 0 and x2 near 1:

[Figure: plot of f(x) showing the two local minima near x = 0 and x = 1]
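A minimal sketch for generating a similar plot, using base MATLAB graphics; the plotting range is an illustrative choice:

```matlab
% Plot f(x) = 100x^2(1 - x)^2 - x over a range that covers both local minima
fun = @(x) 100*x.^2.*(x - 1).^2 - x;   % vectorized version for plotting
fplot(fun,[-0.2 1.2])
xlabel('x')
ylabel('f(x)')
title('f(x) = 100x^2(1 - x)^2 - x')
grid on
```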
The minima are located at:
fun = @(x)(100*x^2*(x - 1)^2 - x);
x1 = fminbnd(fun,-0.1,0.1)
x1 =
0.0051
x2 = fminbnd(fun,0.9,1.1)
x2 =
1.0049

Description of the Terms
| Term | Meaning |
|---|---|
| Single local solution | Find one local solution, a point x where the objective function f(x) is a local minimum. For more details, see Local vs. Global Optima. In the example, both x1 and x2 are local solutions. |
| Multiple local solutions | Find a set of local solutions. In the example, the complete set of local solutions is {x1, x2}. |
| Single global solution | Find the point x where the objective function f(x) is a global minimum. In the example, the global solution is x2. |
Choosing Between Solvers for Smooth Problems
Single Global Solution
1. Try GlobalSearch first. It is most focused on finding a global solution, and has an efficient local solver, fmincon. (A usage sketch follows this list.)
2. Try MultiStart next. It has efficient local solvers, and can search a wide variety of start points.
3. Try patternsearch next. It is less efficient, since it does not use gradients. However, patternsearch is robust and is more efficient than the remaining local solvers. To search for a global solution, start patternsearch from a variety of start points.
4. Try surrogateopt next. surrogateopt attempts to find a global solution using the fewest objective function evaluations. surrogateopt has more overhead per function evaluation than most other solvers. surrogateopt requires finite bounds, and accepts integer constraints, linear constraints, and nonlinear inequality constraints.
5. Try particleswarm next, if your problem is unconstrained or has only bound constraints. Usually, particleswarm is more efficient than the remaining solvers, and can be more efficient than patternsearch.
6. Try ga next. It can handle all types of constraints, and is usually more efficient than simulannealbnd.
7. Try simulannealbnd last. It can handle problems with no constraints or bound constraints. simulannealbnd is usually the least efficient solver. However, given a slow enough cooling schedule, it can find a global solution.
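As a minimal sketch of the first recommendation, the following runs GlobalSearch with its fmincon local solver on the example function defined earlier; the start point and bounds are illustrative choices, not required values:

```matlab
% GlobalSearch sketch for f(x) = 100x^2(1 - x)^2 - x (single global solution)
fun = @(x) 100*x^2*(x - 1)^2 - x;
problem = createOptimProblem('fmincon','objective',fun, ...
    'x0',0.5,'lb',-1,'ub',2);        % illustrative start point and bounds
gs = GlobalSearch;
[xg,fg] = run(gs,problem);           % expect xg near 1, the global minimum
```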
Multiple Local Solutions
GlobalSearch and MultiStart both
provide multiple local solutions. For the syntax to obtain multiple
solutions, see Multiple Solutions. GlobalSearch and MultiStart differ
in the following characteristics:
- MultiStart can find more local minima. This is because GlobalSearch rejects many generated start points (initial points for local solution). Essentially, GlobalSearch accepts a start point only when it determines that the point has a good chance of obtaining a global minimum. In contrast, MultiStart passes all generated start points to a local solver. For more information, see GlobalSearch Algorithm.
- MultiStart offers a choice of local solver: fmincon, fminunc, lsqcurvefit, or lsqnonlin. The GlobalSearch solver uses only fmincon as its local solver.
- GlobalSearch uses a scatter-search algorithm for generating start points. In contrast, MultiStart generates points uniformly at random within bounds, or allows you to provide your own points.
- MultiStart can run in parallel. See How to Use Parallel Processing in Global Optimization Toolbox. (A usage sketch follows this list.)
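A minimal MultiStart sketch on the same example function; the number of start points and the bounds are illustrative assumptions:

```matlab
% MultiStart sketch: run fmincon from 20 start points and collect local solutions
fun = @(x) 100*x^2*(x - 1)^2 - x;
problem = createOptimProblem('fmincon','objective',fun, ...
    'x0',0,'lb',-1,'ub',2);
ms = MultiStart;
[xbest,fbest,exitflag,output,solutions] = run(ms,problem,20);
% solutions is a vector of GlobalOptimSolution objects; each distinct
% solutions(k).X is one local minimum (here, points near 0 and near 1)
```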
Choosing Between Solvers for Nonsmooth Problems
Choose the applicable solver with the lowest number. For problems
with integer constraints, use ga.
1. Use fminbnd first on one-dimensional bounded problems only. fminbnd provably converges quickly in one dimension.
2. Use patternsearch on any other type of problem. patternsearch provably converges, and handles all types of constraints. (A usage sketch follows this list.)
3. Try surrogateopt for problems that have time-consuming objective functions. surrogateopt searches for a global solution. surrogateopt requires finite bounds, and accepts integer constraints, linear constraints, and nonlinear inequality constraints.
4. Try fminsearch next for low-dimensional unbounded problems. fminsearch is not as general as patternsearch and can fail to converge. For low-dimensional problems, fminsearch is simple to use, since it has few tuning options.
5. Try particleswarm next on unbounded or bound-constrained problems. particleswarm has little supporting theory, but is often an efficient algorithm.
6. Try ga next. ga has little supporting theory and is often less efficient than patternsearch or particleswarm. ga handles all types of constraints. ga and surrogateopt are the only Global Optimization Toolbox solvers that accept integer constraints.
7. Try simulannealbnd last for unbounded problems, or for problems with bounds. simulannealbnd provably converges only for a logarithmic cooling schedule, which is extremely slow. simulannealbnd takes only bound constraints, and is often less efficient than ga.
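As a sketch of the patternsearch recommendation, here is a small nonsmooth problem; the objective, start point, and bounds are illustrative assumptions:

```matlab
% patternsearch sketch on a nonsmooth objective (kinks from the absolute values)
fun = @(x) abs(x(1) - 1) + abs(x(2) + 2) + 0.1*norm(x);
x0 = [0 0];                          % user-supplied start point
lb = [-5 -5];                        % bounds are optional for patternsearch
ub = [5 5];
[x,fval] = patternsearch(fun,x0,[],[],[],[],lb,ub);
```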
Solver Characteristics
| Solver | Convergence | Characteristics |
|---|---|---|
| GlobalSearch | Fast convergence to local optima for smooth problems | Deterministic iterates. Gradient-based. Automatic stochastic start points. Removes many start points heuristically. |
| MultiStart | Fast convergence to local optima for smooth problems | Deterministic iterates. Can run in parallel; see How to Use Parallel Processing in Global Optimization Toolbox. Gradient-based. Stochastic or deterministic start points, or a combination of both. Automatic stochastic start points. Runs all start points. Choice of local solver: fmincon, fminunc, lsqcurvefit, or lsqnonlin. |
| patternsearch | Proven convergence to local optimum; slower than gradient-based solvers | Deterministic iterates. Can run in parallel; see How to Use Parallel Processing in Global Optimization Toolbox. No gradients. User-supplied start point. |
| surrogateopt | Proven convergence to global optimum for bounded problems; slower than gradient-based solvers; generally stops by reaching a function evaluation limit or other limit | Stochastic iterates. Can run in parallel; see How to Use Parallel Processing in Global Optimization Toolbox. Best used for time-consuming objective functions. Requires bound constraints; accepts linear constraints and nonlinear inequality constraints. Allows integer constraints; see Mixed-Integer Surrogate Optimization. No gradients. Automatic start points or user-supplied points, or a combination of both. |
| particleswarm | No convergence proof | Stochastic iterates. Can run in parallel; see How to Use Parallel Processing in Global Optimization Toolbox. Population-based. No gradients. Automatic start population or user-supplied population, or a combination of both. Only bound constraints. |
| ga | No convergence proof | Stochastic iterates. Can run in parallel; see How to Use Parallel Processing in Global Optimization Toolbox. Population-based. No gradients. Allows integer constraints; see Mixed Integer ga Optimization. Automatic start population or user-supplied population, or a combination of both. |
| simulannealbnd | Proven to converge to global optimum for bounded problems with very slow cooling schedule | Stochastic iterates. No gradients. User-supplied start point. Only bound constraints. |
Explanation of some characteristics:
- Convergence — Solvers can fail to converge to any solution when started far from a local minimum. When started near a local minimum, gradient-based solvers converge to a local minimum quickly for smooth problems. patternsearch provably converges for a wide range of problems, but the convergence is slower than gradient-based solvers. Both ga and simulannealbnd can fail to converge in a reasonable amount of time for some problems, although they are often effective.
- Iterates — Solvers iterate to find solutions. The steps in the iteration are iterates. Some solvers have deterministic iterates. Others use random numbers and have stochastic iterates.
- Gradients — Some solvers use estimated or user-supplied derivatives in calculating the iterates. Other solvers do not use or estimate derivatives, but use only objective and constraint function values.
- Start points — Most solvers require you to provide a starting point for the optimization in order to obtain the dimension of the decision variables. ga and surrogateopt do not require any starting points, because they take the dimension of the decision variables as an input or infer dimensions from bounds. These solvers generate a start point or population automatically, or they accept a point or points that you supply. (A brief illustration follows this list.)
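As a brief illustration of the start-point difference (the objective and bounds are illustrative assumptions): patternsearch needs a start point x0, surrogateopt infers the dimension from the bounds, and ga takes the dimension as its nvars input:

```matlab
fun = @(x) (x(1) - 3)^2 + (x(2) + 1)^2;            % illustrative smooth objective
lb = [-10 -10]; ub = [10 10];

xps = patternsearch(fun,[0 0],[],[],[],[],lb,ub);  % requires a start point
xso = surrogateopt(fun,lb,ub);                     % dimension inferred from bounds
xga = ga(fun,2,[],[],[],[],lb,ub);                 % dimension supplied as nvars = 2
```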
Compare the characteristics of Global Optimization Toolbox solvers to Optimization Toolbox solvers.
| Solver | Convergence | Characteristics |
|---|---|---|
| fmincon, fminunc, fseminf, lsqcurvefit, lsqnonlin | Proven quadratic convergence to local optima for smooth problems | Deterministic iterates. Gradient-based. User-supplied starting point. |
| fminsearch | No convergence proof — counterexamples exist | Deterministic iterates. No gradients. User-supplied start point. No constraints. |
| fminbnd | Proven convergence to local optima for smooth problems, slower than quadratic | Deterministic iterates. No gradients. User-supplied start interval. Only one-dimensional problems. |
All these Optimization Toolbox solvers:
- Have deterministic iterates
- Require a start point or interval
- Search just one basin of attraction
Why Are Some Solvers Objects?
GlobalSearch and MultiStart are
objects. What does this mean for you?
- You create a GlobalSearch or MultiStart object before running your problem.
- You can reuse the object for running multiple problems.
- GlobalSearch and MultiStart objects are containers for algorithms and global options. You use these objects to run a local solver multiple times. The local solver has its own options. (A usage sketch follows this list.)
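A minimal sketch of creating one GlobalSearch object and reusing it on two different problems; the objective functions, start points, and bounds are illustrative assumptions:

```matlab
gs = GlobalSearch('Display','off');   % container for the global algorithm and its options

p1 = createOptimProblem('fmincon', ...
    'objective',@(x) (x(1)^2 + x(2) - 11)^2 + (x(1) + x(2)^2 - 7)^2, ...
    'x0',[0 0],'lb',[-5 -5],'ub',[5 5]);
p2 = createOptimProblem('fmincon', ...
    'objective',@(x) 100*x^2*(x - 1)^2 - x, ...
    'x0',0.5,'lb',-1,'ub',2);

xa = run(gs,p1);                      % same object ...
xb = run(gs,p2);                      % ... reused on a second problem
```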
For more information, see the Classes documentation.