Global Optimization Toolbox Solver Characteristics
Solver Choices
This section describes Global Optimization Toolbox solver characteristics. The section includes recommendations for obtaining results more effectively.
To achieve better or faster solutions, first try tuning the recommended solvers by setting appropriate options or bounds. If the results are unsatisfactory, try other solvers.
| Desired Solution | Smooth Objective and Constraints | Nonsmooth Objective or Constraints |
|---|---|---|
| Explanation of “Desired Solution” | Choosing Between Solvers for Smooth Problems | Choosing Between Solvers for Nonsmooth Problems |
| Single local solution | Optimization Toolbox™ functions; see Optimization Decision Table | fminbnd, patternsearch, fminsearch, ga, particleswarm, simulannealbnd, surrogateopt |
| Multiple local solutions | GlobalSearch, MultiStart | patternsearch, ga, particleswarm, simulannealbnd, or surrogateopt started from multiple initial points x0 or from multiple initial populations |
| Single global solution | GlobalSearch, MultiStart, patternsearch, particleswarm, ga, simulannealbnd, surrogateopt | patternsearch, ga, particleswarm, simulannealbnd, surrogateopt |
| Single local solution using parallel processing | MultiStart, Optimization Toolbox functions | patternsearch, ga, particleswarm, surrogateopt |
| Multiple local solutions using parallel processing | MultiStart | patternsearch, ga, or particleswarm started from multiple initial points x0 or from multiple initial populations |
| Single global solution using parallel processing | MultiStart | patternsearch, ga, particleswarm, surrogateopt |
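The parallel rows in this table rely on each solver's UseParallel option together with Parallel Computing Toolbox. For example, a minimal patternsearch sketch (the objective function and start point are illustrative assumptions) looks like this:

```matlab
% Enable parallel objective evaluation in patternsearch
% (requires Parallel Computing Toolbox; the objective and start point are
% illustrative assumptions)
fun = @(x) sum(abs(x - [1 2 3])) + norm(x)^2;
opts = optimoptions('patternsearch','UseParallel',true,'UseCompletePoll',true);
x0 = [0 0 0];
[x,fval] = patternsearch(fun,x0,[],[],[],[],[],[],[],opts);
```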
Explanation of “Desired Solution”
To understand the meaning of the terms in “Desired Solution,” consider the example

f(x) = 100x^2(1 - x)^2 - x,

which has local minima x1 near 0 and x2 near 1.
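To visualize the two minima, plot the function over an interval that contains both; the plotting range and labels in this sketch are illustrative choices:

```matlab
% Plot f(x) = 100*x^2*(1-x)^2 - x over an interval containing both minima
% (plotting range and labels are illustrative choices)
fun = @(x) 100*x.^2.*(x - 1).^2 - x;
fplot(fun,[-0.3 1.3])
xlabel('x')
ylabel('f(x)')
```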
The minima are located at:
```matlab
fun = @(x)(100*x^2*(x - 1)^2 - x);
x1 = fminbnd(fun,-0.1,0.1)
x2 = fminbnd(fun,0.9,1.1)
```

```
x1 = 0.0051
x2 = 1.0049
```
Description of the Terms
| Term | Meaning |
|---|---|
| Single local solution | Find one local solution, a point x where the objective function f(x) is a local minimum. For more details, see Local vs. Global Optima. In the example, both x1 and x2 are local solutions. |
| Multiple local solutions | Find a set of local solutions. In the example, the complete set of local solutions is {x1, x2}. |
| Single global solution | Find the point x where the objective function f(x) is a global minimum. In the example, the global solution is x2. |
Choosing Between Solvers for Smooth Problems
Single Global Solution
1. Try GlobalSearch first. It is most focused on finding a global solution, and has an efficient local solver, fmincon. (A GlobalSearch sketch follows this list.)
2. Try MultiStart next. It has efficient local solvers, and can search a wide variety of start points.
3. Try patternsearch next. It is less efficient, since it does not use gradients. However, patternsearch is robust and is more efficient than the remaining local solvers. To search for a global solution, start patternsearch from a variety of start points.
4. Try surrogateopt next. surrogateopt attempts to find a global solution using the fewest objective function evaluations. surrogateopt has more overhead per function evaluation than most other solvers. surrogateopt requires finite bounds, and accepts integer constraints, linear constraints, and nonlinear inequality constraints.
5. Try particleswarm next, if your problem is unconstrained or has only bound constraints. Usually, particleswarm is more efficient than the remaining solvers, and can be more efficient than patternsearch.
6. Try ga next. It can handle all types of constraints, and is usually more efficient than simulannealbnd.
7. Try simulannealbnd last. It can handle problems with no constraints or bound constraints. simulannealbnd is usually the least efficient solver. However, given a slow enough cooling schedule, it can find a global solution.
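For example, step 1 applied to the function from “Explanation of ‘Desired Solution’” can look like the following sketch; the start point and bounds are illustrative choices:

```matlab
% Run GlobalSearch with fmincon on the example objective
% (start point and bounds are illustrative choices)
fun = @(x)(100*x^2*(x - 1)^2 - x);
problem = createOptimProblem('fmincon','objective',fun, ...
    'x0',0.5,'lb',-10,'ub',10);
gs = GlobalSearch;
[xg,fg] = run(gs,problem)
```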
Multiple Local Solutions
GlobalSearch and MultiStart both provide multiple local solutions. For the syntax to obtain multiple solutions, see Multiple Solutions. GlobalSearch and MultiStart differ in the following characteristics:
- MultiStart can find more local minima. This is because GlobalSearch rejects many generated start points (initial points for a local solution). Essentially, GlobalSearch accepts a start point only when it determines that the point has a good chance of obtaining a global minimum. In contrast, MultiStart passes all generated start points to a local solver. For more information, see GlobalSearch Algorithm. (A MultiStart sketch follows this list.)
- MultiStart offers a choice of local solver: fmincon, fminunc, lsqcurvefit, or lsqnonlin. The GlobalSearch solver uses only fmincon as its local solver.
- GlobalSearch uses a scatter-search algorithm for generating start points. In contrast, MultiStart generates points uniformly at random within bounds, or allows you to provide your own points.
- MultiStart can run in parallel. See How to Use Parallel Processing in Global Optimization Toolbox.
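For example, the following sketch runs MultiStart on the example function and collects every distinct local solution found; the number of start points and the bounds are illustrative choices:

```matlab
% Run fmincon from 50 start points and collect all local solutions
% (number of runs and bounds are illustrative choices)
fun = @(x)(100*x^2*(x - 1)^2 - x);
problem = createOptimProblem('fmincon','objective',fun, ...
    'x0',0.5,'lb',-10,'ub',10);
ms = MultiStart;
[xbest,fbest,eflag,output,solutions] = run(ms,problem,50);
% solutions is a vector of GlobalOptimSolution objects,
% one for each distinct local minimum that MultiStart found
```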
Choosing Between Solvers for Nonsmooth Problems
Choose the applicable solver with the lowest number. For problems with integer constraints, use ga.

1. Use fminbnd first on one-dimensional bounded problems only. fminbnd provably converges quickly in one dimension.
2. Use patternsearch on any other type of problem. patternsearch provably converges, and handles all types of constraints. (A patternsearch sketch follows this list.)
3. Try surrogateopt for problems that have time-consuming objective functions. surrogateopt searches for a global solution. surrogateopt requires finite bounds, and accepts integer constraints, linear constraints, and nonlinear inequality constraints.
4. Try fminsearch next for low-dimensional unbounded problems. fminsearch is not as general as patternsearch and can fail to converge. For low-dimensional problems, fminsearch is simple to use, since it has few tuning options.
5. Try particleswarm next on unbounded or bound-constrained problems. particleswarm has little supporting theory, but is often an efficient algorithm.
6. Try ga next. ga has little supporting theory and is often less efficient than patternsearch or particleswarm. ga handles all types of constraints. ga and surrogateopt are the only Global Optimization Toolbox solvers that accept integer constraints.
7. Try simulannealbnd last for unbounded problems, or for problems with bounds. simulannealbnd provably converges only for a logarithmic cooling schedule, which is extremely slow. simulannealbnd takes only bound constraints, and is often less efficient than ga.
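For example, step 2 applied to a simple nonsmooth function can look like this sketch; the objective and start point are illustrative assumptions:

```matlab
% Minimize a nonsmooth objective with patternsearch
% (the objective and start point are illustrative assumptions)
fun = @(x) abs(x(1) - 1) + abs(x(2) + 2);   % kinks at the minimizer
x0 = [0 0];
[x,fval] = patternsearch(fun,x0)
```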
Solver Characteristics
| Solver | Convergence | Characteristics |
|---|---|---|
| GlobalSearch | Fast convergence to local optima for smooth problems | Deterministic iterates<br>Gradient-based<br>Automatic stochastic start points<br>Removes many start points heuristically |
| MultiStart | Fast convergence to local optima for smooth problems | Deterministic iterates<br>Can run in parallel; see How to Use Parallel Processing in Global Optimization Toolbox<br>Gradient-based<br>Stochastic or deterministic start points, or combination of both<br>Automatic stochastic start points<br>Runs all start points<br>Choice of local solver: fmincon, fminunc, lsqcurvefit, or lsqnonlin |
| patternsearch | Proven convergence to local optimum; slower than gradient-based solvers | Deterministic iterates<br>Can run in parallel; see How to Use Parallel Processing in Global Optimization Toolbox<br>No gradients<br>User-supplied start point |
| surrogateopt | Proven convergence to global optimum for bounded problems; slower than gradient-based solvers; generally stops by reaching a function evaluation limit or other limit | Stochastic iterates<br>Can run in parallel; see How to Use Parallel Processing in Global Optimization Toolbox<br>Best used for time-consuming objective functions<br>Requires bound constraints, accepts linear constraints and nonlinear inequality constraints<br>Allows integer constraints; see Mixed-Integer Surrogate Optimization<br>No gradients<br>Automatic start points or user-supplied points, or a combination of both |
| particleswarm | No convergence proof | Stochastic iterates<br>Can run in parallel; see How to Use Parallel Processing in Global Optimization Toolbox<br>Population-based<br>No gradients<br>Automatic start population or user-supplied population, or a combination of both<br>Only bound constraints |
| ga | No convergence proof | Stochastic iterates<br>Can run in parallel; see How to Use Parallel Processing in Global Optimization Toolbox<br>Population-based<br>No gradients<br>Allows integer constraints; see Mixed Integer ga Optimization<br>Automatic start population or user-supplied population, or a combination of both |
| simulannealbnd | Proven to converge to global optimum for bounded problems with very slow cooling schedule | Stochastic iterates<br>No gradients<br>User-supplied start point<br>Only bound constraints |
Explanation of some characteristics:
- Convergence: Solvers can fail to converge to any solution when started far from a local minimum. When started near a local minimum, gradient-based solvers converge to a local minimum quickly for smooth problems. patternsearch provably converges for a wide range of problems, but the convergence is slower than for gradient-based solvers. Both ga and simulannealbnd can fail to converge in a reasonable amount of time for some problems, although they are often effective.
- Iterates: Solvers iterate to find solutions. The steps in the iteration are iterates. Some solvers have deterministic iterates. Others use random numbers and have stochastic iterates.
- Gradients: Some solvers use estimated or user-supplied derivatives in calculating the iterates. Other solvers do not use or estimate derivatives, but use only objective and constraint function values.
- Start points: Most solvers require you to provide a starting point for the optimization in order to obtain the dimension of the decision variables. ga and surrogateopt do not require any starting points, because they take the dimension of the decision variables as an input or infer dimensions from bounds. These solvers generate a start point or population automatically, or they accept a point or points that you supply. (A sketch follows this list.)
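For example, the following sketch shows the two calling styles from the last bullet; the objective, number of variables, and bounds are illustrative assumptions:

```matlab
% ga takes the number of variables instead of a start point;
% surrogateopt infers the dimension from finite bounds
% (objective, nvars, and bounds are illustrative assumptions)
fun = @(x) 100*x(1)^2*(x(1) - 1)^2 - x(1) + x(2)^2;
nvars = 2;
xga = ga(fun,nvars);             % no start point required
lb = [-10 -10];
ub = [10 10];
xsur = surrogateopt(fun,lb,ub);  % dimension inferred from the bounds
```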
Compare the characteristics of Global Optimization Toolbox solvers to Optimization Toolbox solvers.
| Solver | Convergence | Characteristics |
|---|---|---|
| fmincon, fminunc, fseminf, lsqcurvefit, lsqnonlin | Proven quadratic convergence to local optima for smooth problems | Deterministic iterates<br>Gradient-based<br>User-supplied starting point |
| fminsearch | No convergence proof; counterexamples exist | Deterministic iterates<br>No gradients<br>User-supplied start point<br>No constraints |
| fminbnd | Proven convergence to local optima for smooth problems, slower than quadratic | Deterministic iterates<br>No gradients<br>User-supplied start interval<br>Only one-dimensional problems |
All these Optimization Toolbox solvers:
- Have deterministic iterates
- Require a start point or interval
- Search just one basin of attraction
Why Are Some Solvers Objects?
GlobalSearch and MultiStart are objects. What does this mean for you?
- You create a GlobalSearch or MultiStart object before running your problem. You can reuse the object for running multiple problems. (A reuse sketch follows this list.)
- GlobalSearch and MultiStart objects are containers for algorithms and global options. You use these objects to run a local solver multiple times. The local solver has its own options.
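For example, the following sketch creates one MultiStart object and reuses it on two problems; the objectives, bounds, start points, and run counts are illustrative assumptions:

```matlab
% Create a MultiStart object once, then reuse it on two problems
% (objectives, bounds, start points, and run counts are illustrative assumptions)
ms = MultiStart('Display','final');

p1 = createOptimProblem('fmincon','objective',@(x)(x - 3)^2, ...
    'x0',0,'lb',-5,'ub',10);
x1 = run(ms,p1,10);    % same object, first problem

p2 = createOptimProblem('fmincon','objective',@(x)peaks(x(1),x(2)), ...
    'x0',[0 0],'lb',[-3 -3],'ub',[3 3]);
x2 = run(ms,p2,20);    % same object, second problem
```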
For more information, see the Classes documentation.