The software described in this document is furnished under a license agreement. The software may be used
or copied only under the terms of the license agreement. No part of this manual may be photocopied or
reproduced in any form without prior written consent from The MathWorks, Inc.
FEDERAL ACQUISITION: This provision applies to all acquisitions of the Program and Documentation
by, for, or through the federal government of the United States. By accepting delivery of the Program
or Documentation, the government hereby agrees that this software or documentation qualifies as
commercial computer software or commercial computer software documentation as such terms are used
or defined in FAR 12.212, DFARS Part 227.72, and DFARS 252.227-7014. Accordingly, the terms and
conditions of this Agreement and only those rights specified in this Agreement, shall pertain to and govern
the use, modification, reproduction, release, performance, display, and disclosure of the Program and
Documentation by the federal government (or other entity acquiring for or through the federal government)
and shall supersede any conflicting contractual terms or conditions. If this License fails to meet the
government’s needs or is inconsistent in any respect with federal procurement law, the government agrees
to return the Program and Documentation, unused, to The MathWorks, Inc.
Trademarks
MATLAB and Simulink are registered trademarks of The MathWorks, Inc. See
www.mathworks.com/trademarks for a list of additional trademarks. Other product or brand
names may be trademarks or registered trademarks of their respective holders.
Patents
The MathWorks products are protected by one or more U.S. patents. Please see
www.mathworks.com/patents for more information.
Revision History
January 2004    Online only       New for Version 1.0 (Release 13SP1+)
June 2004       First printing    Revised for Version 1.0.1 (Release 14)
October 2004    Online only       Revised for Version 1.0.2 (Release 14SP1)
March 2005      Online only       Revised for Version 1.0.3 (Release 14SP2)
September 2005  Second printing   Revised for Version 2.0 (Release 14SP3)
March 2006      Online only       Revised for Version 2.0.1 (Release 2006a)
September 2006  Online only       Revised for Version 2.0.2 (Release 2006b)
March 2007      Online only       Revised for Version 2.1 (Release 2007a)
September 2007  Third printing    Revised for Version 2.2 (Release 2007b)
March 2008      Online only       Revised for Version 2.3 (Release 2008a)
October 2008    Online only       Revised for Version 2.4 (Release 2008b)
March 2009      Online only       Revised for Version 2.4.1 (Release 2009a)
September 2009  Online only       Revised for Version 2.4.2 (Release 2009b)
March 2010      Online only       Revised for Version 3.0 (Release 2010a)
Objective (Fitness) Functions ........................ 2-2
Example: Writing a Function File ..................... 2-2
Example: Writing a Vectorized Function ............... 2-3
Gradients and Hessians ............................... 2-4
Maximizing vs. Minimizing ............................ 2-5
Constraints .......................................... 2-6
Set Bounds ........................................... 2-6
Gradients and Hessians ............................... 2-6
Vectorized Constraints ............................... 2-7
Using GlobalSearch and MultiStart

3

How to Optimize with GlobalSearch and MultiStart ..... 3-2
Problems You Can Solve with GlobalSearch and
  MultiStart ......................................... 3-2
Outline of Steps ..................................... 3-2
Determining Inputs for the Problem ................... 3-4
Create a Problem Structure ........................... 3-4
Create a Solver Object ............................... 3-13
Set Start Points for MultiStart ...................... 3-16
Run the Solver ....................................... 3-19
Examining Results
Single Solution
Multiple Solutions
Iterative Display
Global Output Structures
Example: Visualizing the Basins of Attraction
How GlobalSearch and MultiStart Work
Multiple Runs of a Local Solver
Differences Between the Solver Objects
GlobalSearch Algorithm
MultiStart Algorithm
Bibliography
Improving Results
Can You Certify a Solution Is Global?
Refining the Start Points
Changing Options
Reproducing Results
GlobalSearch and MultiStart Examples
Example: Finding Global or Multiple Local Minima
Example: Using Only Feasible Start Points
What Is Direct Search? ............................... 4-2

Performing a Pattern Search .......................... 4-3
Calling patternsearch at the Command Line ............ 4-3
Using the Optimization Tool for Pattern Search ....... 4-3

Example: Finding the Minimum of a Function Using
  the GPS Algorithm .................................. 4-7
Objective Function ................................... 4-7
Finding the Minimum of the Function .................. 4-8
Plotting the Objective Function Values and Mesh Sizes  4-9

Pattern Search Terminology ........................... 4-11
Patterns ............................................. 4-11
Meshes ............................................... 4-12
Polling .............................................. 4-13
Expanding and Contracting ............................ 4-13

How Pattern Search Works ............................. 4-14
Context .............................................. 4-14
Successful Polls ..................................... 4-15
An Unsuccessful Poll ................................. 4-18
Displaying the Results at Each Iteration ............. 4-19
More Iterations ...................................... 4-20
Stopping Conditions for the Pattern Search ........... 4-21
Description of the Nonlinear Constraint Solver ....... 4-24

Performing a Pattern Search Using the Optimization
  Tool GUI ........................................... 4-26
Example: A Linearly Constrained Problem .............. 4-26
Displaying Plots ..................................... 4-29
Example: Working with a Custom Plot Function ........4-30
Performing a Pattern Search from the Command
  Line ............................................... 4-36
Calling patternsearch with the Default Options ....... 4-36
Setting Options for patternsearch at the Command Line  4-38
Using Options and Problems from the Optimization
  Tool ............................................... 4-40
Pattern Search Examples: Setting Options ............. 4-42
Poll Method .......................................... 4-42
Complete Poll ........................................ 4-44
Example: Comparing the Efficiency of Poll Options .... 4-48
Using a Search Method ................................ 4-55
Mesh Expansion and Contraction ....................... 4-59
Mesh Accelerator ..................................... 4-64
Using Cache .......................................... 4-65
Setting Tolerances for the Solver .................... 4-68
Constrained Minimization Using patternsearch ......... 4-69
Vectorizing the Objective and Constraint Functions ... 4-72
Using the Genetic Algorithm
5
What Is the Genetic Algorithm? .....................5-2
Performing a Genetic Algorithm Optimization .......... 5-3
Calling the Function ga at the Command Line .......... 5-3
Using the Optimization Tool .......................... 5-4
Example: Rastrigin's Function
Rastrigin's Function
Finding the Minimum of Rastrigin's Function
Finding the Minimum from the Command Line
Displaying Plots

Populations and Generations .......................... 5-18
Diversity ............................................ 5-18
Fitness Values and Best Fitness Values ............... 5-19
Parents and Children ................................. 5-19
How the Genetic Algorithm Works ...................... 5-20
Outline of the Algorithm ............................. 5-20
Initial Population ................................... 5-21
Creating the Next Generation ......................... 5-22
Plots of Later Generations ........................... 5-24
Stopping Conditions for the Algorithm ................ 5-24

Description of the Nonlinear Constraint Solver ....... 5-27

Genetic Algorithm Optimizations Using the
  Optimization Tool GUI .............................. 5-29
Introduction ......................................... 5-29
Displaying Plots ..................................... 5-29
Example: Creating a Custom Plot Function ............. 5-30
Reproducing Your Results ............................. 5-33
Example: Resuming the Genetic Algorithm from the Final
  Population ......................................... 5-34

Using the Genetic Algorithm from the Command
  Line ............................................... 5-39
Running ga with the Default Options .................. 5-39
Setting Options for ga at the Command Line ........... 5-40
Using Options and Problems from the Optimization
  Tool ............................................... 5-43
Reproducing Your Results ............................. 5-44
Resuming ga from the Final Population of a Previous
  Run ................................................ 5-45
Running ga From a File ............................... 5-46
Genetic Algorithm Examples ........................... 5-49
Improving Your Results ............................... 5-49
Population Diversity ................................. 5-49
Fitness Scaling ...................................... 5-59
Selection ............................................ 5-62
Reproduction Options ................................. 5-63
Mutation and Crossover ............................... 5-63
Setting the Amount of Mutation .....................5-64
Setting the Crossover Fraction ....................... 5-66
Comparing Results for Varying Crossover Fractions .... 5-70
Example: Global vs. Local Minima with GA ............. 5-72
Using a Hybrid Function .............................. 5-76
Setting the Maximum Number of Generations ............ 5-80
Vectorizing the Fitness Function ..................... 5-81
Constrained Minimization Using ga .................... 5-82
Using Simulated Annealing
6
What Is Simulated Annealing? ......................6-2
Performing a Simulated Annealing Optimization
Calling simulannealbnd at the Command Line
Using the Optimization Tool

Example: Minimizing De Jong's Fifth Function
Description
Minimizing at the Command Line
Minimizing Using the Optimization Tool

Some Simulated Annealing Terminology
Objective Function
Temperature
Annealing Schedule
Reannealing

How Simulated Annealing Works
Outline of the Algorithm
Stopping Conditions for the Algorithm

Using Simulated Annealing from the Command Line
Running simulannealbnd With the Default Options
Setting Options for simulannealbnd at the Command Line
Types of Parallel Processing in Global Optimization
  Toolbox ............................................ 8-2
How Toolbox Functions Distribute Processes ........... 8-3

How to Use Parallel Processing ....................... 8-11
Multicore Processors ................................. 8-11
Processor Network .................................... 8-12
Parallel Search Functions or Hybrid Functions ........ 8-14
Options Reference
9
GlobalSearch and MultiStart Properties (Options) ....9-2
How to Set Properties ..............................9-2
Properties of Both Objects ........................... 9-2
GlobalSearch Properties .............................. 9-4
MultiStart Properties ................................ 9-6
Pattern Search Options ............................... 9-7
Optimization Tool vs. Command Line ................... 9-7
Plot Options ......................................... 9-8
Poll Options ......................................... 9-10
Search Options ....................................... 9-13
Mesh Options ......................................... 9-17
Algorithm Settings ................................... 9-19
Cache Options ........................................ 9-19
Stopping Criteria .................................... 9-20
Output Function Options .............................. 9-21
Display to Command Window Options .................... 9-23
Vectorize and Parallel Options (User Function
  Evaluation) ........................................ 9-24
Options Table for Pattern Search Algorithms .......... 9-25

Genetic Algorithm Options ............................ 9-29
Optimization Tool vs. Command Line ................... 9-29
Plot Options ......................................... 9-30
Population Options ................................... 9-33
Fitness Scaling Options .............................. 9-36
Selection Options .................................... 9-37
Reproduction Options ................................. 9-39
Mutation Options ..................................... 9-40
Crossover Options .................................... 9-42
Migration Options .................................... 9-45
Algorithm Settings ................................... 9-46
Multiobjective Options ............................... 9-47
Hybrid Function Options .............................. 9-47
Stopping Criteria Options ............................ 9-48
Output Function Options .............................. 9-48
Display to Command Window Options .................... 9-50
Vectorize and Parallel Options (User Function
  Evaluation) ........................................ 9-51
Simulated Annealing Options
saoptimset At The Command Line
Plot Options
Temperature Options
• "Example: Comparing Several Solvers" on page 1-4
• "What Is Global Optimization?" on page 1-12
• "Choosing a Solver" on page 1-17

1 Introducing Global Optimization Toolbox Functions
Product Overview
Global Optimization Toolbox provides methods that search for global
solutions to problems that contain multiple maxima or minima. It includes
global search, multistart, pattern search, genetic algorithm, and simulated
annealing solvers. You can use these solvers to solve optimization problems
where the objective or constraint function is continuous, discontinuous,
stochastic, does not possess derivatives, or includes simulations or black-box
functions with undefined values for some parameter settings.
Genetic algorithm and pattern search solvers support algorithmic
customization. You can create a custom genetic algorithm variant by
modifying initial population and fitness scaling options or by defining parent
selection, crossover, and mutation functions. You can customize pattern
search by defining polling, searching, and other functions.
Key features:
• Interactive tools for defining and solving optimization problems and
monitoring solution progress
• Global search and multistart solvers for finding single or multiple global
optima
• Genetic algorithm solver that supports linear, nonlinear, and bound
constraints
• Multiobjective genetic algorithm with Pareto-front identification, including
linear and bound constraints
• Pattern search solver that supports linear, nonlinear, and bound
constraints
• Simulated annealing tools that implement a random search method,
with options for defining annealing process, temperature schedule, and
acceptance criteria
• Parallel computing support in multistart, genetic algorithm, and pattern
search solver
• Custom data type support in genetic algorithm, multiobjective genetic
algorithm, and simulated annealing solvers
Implementation Notes

Global Optimization Toolbox solvers repeatedly attempt to locate a global
solution. However, no solver employs an algorithm that can certify a solution
as global.

You can extend the capabilities of Global Optimization Toolbox functions by
writing your own files, by using them in combination with other toolboxes, or
with the MATLAB® or Simulink® environments.
Example: Comparing Several Solvers
In this section...
“Function to Optimize” on page 1-4
“Four Solution Methods” on page 1-5
“Comparison of Syntax and Solutions” on page 1-10
Function to Optimize

This example shows how to minimize Rastrigin's function with four solvers.
Each solver has its own characteristics. The characteristics lead to different
solutions and run times. The results, examined in "Comparison of Syntax
and Solutions" on page 1-10, can help you choose an appropriate solver for
your own problems.

Rastrigin's function has many local minima, with a global minimum at (0,0):

    Ras(x) = 20 + x1^2 + x2^2 - 10(cos 2πx1 + cos 2πx2).

Usually you don't know the location of the global minimum of your objective
function. To show how the solvers look for a global solution, this example
starts all the solvers around the point [20,30], which is far from the global
minimum.

The rastriginsfcn.m file implements Rastrigin's function. This file comes
with Global Optimization Toolbox software. This example employs a
scaled version of Rastrigin's function with larger basins of attraction. For
more information, see "Basins of Attraction" on page 1-13.

rf2 = @(x)rastriginsfcn(x/10);
This example minimizes rf2 using the default settings of fminunc (an
Optimization Toolbox™ solver), patternsearch, and GlobalSearch. It also
uses ga with a nondefault setting, to obtain an initial population around
the point [20,30].
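The Rastrigin formula above is easy to cross-check outside the toolbox. The following Python sketch (ours, for illustration only; the function names rastrigin and rf2 are assumptions, not part of the product) reproduces the function and the scaled rf2 variant, and confirms the global minimum value of 0 at the origin:

```python
import math

def rastrigin(x1, x2):
    # Ras(x) = 20 + x1^2 + x2^2 - 10(cos 2*pi*x1 + cos 2*pi*x2)
    return 20 + x1**2 + x2**2 - 10*(math.cos(2*math.pi*x1) + math.cos(2*math.pi*x2))

def rf2(x1, x2):
    # scaled version with larger basins of attraction,
    # mirroring rf2 = @(x)rastriginsfcn(x/10)
    return rastrigin(x1/10, x2/10)

print(rastrigin(0, 0))   # -> 0.0, the global minimum
```

Nearby integer points such as (1,0) give small but nonzero values, which is what creates the many local minima.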
Four Solution Methods
• “fminunc” on page 1-6
• “patternsearch” on page 1-7
• “ga” on page 1-8
• “GlobalSearch” on page 1-9
fminunc
To solve the optimization problem using the fminunc Optimization Toolbox
solver, enter:
rf2 = @(x)rastriginsfcn(x/10); % objective
x0 = [20,30]; % start point away from the minimum
[xf ff flf of] = fminunc(rf2,x0)
fminunc
returns
Warning: Gradient must be provided for trust-region algorithm;
using line-search algorithm instead.
> In fminunc at 347
Local minimum found.
Optimization completed because the size of the gradient is
less than the default value of the function tolerance.
xf =
   19.8991   29.8486
ff =
12.9344
flf =
1
of =
iterations: 3
funcCount: 15
stepsize: 1
firstorderopt: 5.9907e-009
algorithm: 'medium-scale: Quasi-Newton line search'
message: [1x438 char]
• xf is the minimizing point.
• ff is the value of the objective, rf2, at xf.
• flf is the exit flag. An exit flag of 1 indicates xf is a local minimum.
• of is the output structure, which describes the fminunc calculations
  leading to the solution.
patternsearch
To solve the optimization problem using the patternsearch Global
Optimization Toolbox solver, ente r:
rf2 = @(x)rastriginsfcn(x/10); % objective
x0 = [20,30]; % start point away from the minimum
[xp fp flp op] = patternsearch(rf2,x0)
patternsearch returns

Optimization terminated: mesh size less than options.TolMesh.
xp =
   19.8991   -9.9496
fp =
    4.9748
flp =
     1
op =
        function: @(x)rastriginsfcn(x/10)
     problemtype: 'unconstrained'
      pollmethod: 'gpspositivebasis2n'
    searchmethod: []
      iterations: 48
       funccount: 174
        meshsize: 9.5367e-007
         message: 'Optimization terminated: mesh size less than
                   options.TolMesh.'
• xp is the minimizing point.
• fp is the value of the objective, rf2, at xp.
• flp is the exit flag. An exit flag of 1 indicates xp is a local minimum.
• op is the output structure, which describes the patternsearch calculations
  leading to the solution.
ga
To solve the optimization problem using the ga Global Optimization Toolbox
solver, enter:
rf2 = @(x)rastriginsfcn(x/10); % objective
x0 = [20,30]; % start point away from the minimum
initpop = 10*randn(20,2) + repmat([10 30],20,1);
opts = gaoptimset('InitialPopulation',initpop);
[xga fga flga oga] = ga(rf2,2,[],[],[],[],[],[],[],opts)
initpop is a 20-by-2 matrix. Each row of initpop has mean [10,30], and
each element is normally distributed with standard deviation 10. The rows
of initpop form an initial population matrix for the ga solver.

opts is an optimization structure setting initpop as the initial population.

The final line calls ga, using the optimization structure.

ga uses random numbers, and produces a random result. In this case ga
returns:
Optimization terminated: average change in the fitness value
less than options.TolFun.
xga =
   -0.0016   19.8821
fga =
3.9804
flga =
1
oga =
problemtype: 'unconstrained'
rngstate: [1x1 struct]
generations: 54
funccount: 1100
message: [1x86 char]
• xga is the minimizing point.
• fga is the value of the objective, rf2, at xga.
• flga is the exit flag. An exit flag of 1 indicates xga is a local minimum.
• oga is the output structure, which describes the ga calculations leading
  to the solution.
GlobalSearch
To solve the optimization problem using the GlobalSearch solver, enter:

rf2 = @(x)rastriginsfcn(x/10); % objective
x0 = [20,30]; % start point away from the minimum
problem = createOptimProblem('fmincon','objective',rf2,...
    'x0',x0);
gs = GlobalSearch;
[xg fg flg og] = run(gs,problem)

problem is an optimization problem structure. problem specifies the fmincon
solver, the rf2 objective function, and x0 = [20,30]. For more information
on using createOptimProblem, see "Create a Problem Structure" on page 3-4.

Note You must specify fmincon as the solver for GlobalSearch, even for
unconstrained problems.

gs is a default GlobalSearch object. The object contains options for solving
the problem. Calling run(gs,problem) runs problem from multiple start
points. The start points are random, so the following result is also random.
In this case, the run returns:
GlobalSearch stopped because it analyzed all the trial points.
All 6 local solver runs converged with a positive local solver exit flag.
xg =
1.0e-007 *
  -0.1405   -0.1405
fg =
0
flg =
1
og =
funcCount: 2312
localSolverTotal: 6
localSolverSuccess: 6
localSolverIncomplete: 0
localSolverNoSolution: 0
message: [1x137 char]
• xg is the minimizing point.
• fg is the value of the objective, rf2, at xg.
• flg is the exit flag. An exit flag of 1 indicates all fmincon runs converged
  properly.
• og is the output structure, which describes the GlobalSearch calculations
  leading to the solution.
Comparison of Syntax and Solutions
One solution is better than another if its objective function value is smaller
than the other. The following table summarizes the results, accurate to one
decimal.
Results

Solver          Solution       Objective    # Fevals
fminunc         [19.9 29.9]    12.9         15
patternsearch   [19.9 -9.9]    5            174
ga              [0 19.9]       4            1040
GlobalSearch    [0 0]          0            2312
These results are typical:

• fminunc quickly reaches the local solution within its starting basin, but
  does not explore outside this basin at all. fminunc has simple calling
  syntax.

• patternsearch takes more function evaluations than fminunc, and
  searches through several basins, arriving at a better solution than
  fminunc. The patternsearch calling syntax is the same as that of fminunc.

• ga takes many more function evaluations than patternsearch. By chance
  it arrived at a slightly better solution. ga is stochastic, so its results change
  with every run. ga has simple calling syntax, but there are extra steps to
  have an initial population near [20,30].

• A GlobalSearch run takes many more function evaluations than
  patternsearch, searches many basins, and arrives at an even better
  solution. In this case, GlobalSearch found the global optimum. Setting
  up GlobalSearch is more involved than setting up the other solvers. As
  the example shows, before calling GlobalSearch, you must create both
  a GlobalSearch object (gs in the example), and a problem structure
  (problem). Then, call the run method with gs and problem. For
  more details on how to run GlobalSearch, see "How to Optimize with
  GlobalSearch and MultiStart" on page 3-2.
What Is Global Optimization?
In this section...
“Local vs. Global Optima” on page 1-12
“Basins of Attraction” on page 1-13
Local vs. Global Optima
Optimization is the process of finding the point that minimizes a function.
More specifically:
• A local minimum of a function is a point where the function value is
smaller than or equal to the value at nearby points, but possibly greater
than at a distant point.
• A global minimum is a point where the function value is smaller than or
equal to the value at all other feasible points.
[Figure: a curve with its global minimum and a shallower local minimum labeled]
Generally, Optimization Toolbox solvers find a local optimum. (This local
optimum can be a global optimum.) They find the optimum in the basin
of attraction of the starting point. For more information, see "Basins of
Attraction" on page 1-13.
In contrast, Global Optimization Toolbox solvers are designed to search
through more than one basin of attraction. They search in various ways:
• GlobalSearch and MultiStart generate a number of starting points. They
  then use a local solver to find the optima in the basins of attraction of the
  starting points.

• ga uses a set of starting points (called the population) and iteratively
  generates better points from the population. As long as the initial
  population covers several basins, ga can examine several basins.

• simulannealbnd performs a random search. Generally, simulannealbnd
  accepts a point if it is better than the previous point. simulannealbnd
  occasionally accepts a worse point, in order to reach a different basin.

• patternsearch looks at a number of neighboring points before accepting
  one of them. If some neighboring points belong to different basins,
  patternsearch in essence looks in a number of basins at once.
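The "occasionally accepts a worse point" behavior is the heart of simulated annealing. As a rough illustration (in Python, ours, not the toolbox code), here is the classic Metropolis acceptance rule; simulannealbnd's default acceptance function differs in detail, but the idea is the same: improvements are always taken, and uphill moves are taken with a probability that shrinks as the temperature falls:

```python
import math, random

def metropolis_accept(delta, temperature, rng=random):
    # delta = candidate objective value minus current objective value.
    # Improvements (delta <= 0) are always accepted; worse points are
    # accepted with probability exp(-delta/T), so a high temperature
    # lets the search hop out of one basin and into another.
    if delta <= 0:
        return True
    return rng.random() < math.exp(-delta / temperature)
```

At high temperature almost every move is accepted; as the temperature cools, the rule degenerates toward pure descent.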
Basins of Attraction
If an objective function f(x) is smooth, the vector –∇f(x) points in the direction
where f(x) decreases most quickly. The equation of steepest descent, namely

    dx(t)/dt = –∇f(x(t)),

yields a path x(t) that goes to a local minimum as t gets large. Generally,
initial values x(0) that are close to each other give steepest descent paths that
tend to the same minimum point. The basin of attraction for steepest descent
is the set of initial values leading to the same local minimum.

The following figure shows two one-dimensional minima. The figure shows
different basins of attraction with different line styles, and it shows directions
of steepest descent with arrows. For this and subsequent figures, black dots
represent local minima. Every steepest descent path, starting at a point x(0),
goes to the black dot in the basin containing x(0).

[Figure: f(x) plotted against x, showing two basins of attraction]
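The basin-of-attraction idea is easy to demonstrate numerically. This Python sketch (ours, for illustration) discretizes the steepest-descent equation for the two-minimum function f(x) = x^4 - 2x^2, an example function we chose; its minima sit at x = -1 and x = 1, and starting values on opposite sides of the ridge at x = 0 flow to different minima:

```python
def steepest_descent(x0, step=0.01, iters=5000):
    # follow dx/dt = -f'(x) for f(x) = x^4 - 2x^2,
    # using a fixed-step Euler discretization
    x = x0
    for _ in range(iters):
        x -= step * (4*x**3 - 4*x)   # f'(x) = 4x^3 - 4x
    return x

# starts in the right-hand basin reach +1, left-hand starts reach -1
print(round(steepest_descent(0.5), 4), round(steepest_descent(-0.5), 4))   # -> 1.0 -1.0
```

The set of all x(0) ending at +1 is one basin of attraction; the set ending at -1 is the other.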
The following figure shows how steepest descent paths can be more
complicated in more dimensions.
The following figure shows even more complicated paths and basins of
attraction.
Constraints can break up one basin of attraction into several pieces. For
example, consider minimizing y subject to:
• y ≥ |x|
• y ≥ 5 – 4(x – 2)²
The figure shows the two basins of attraction with the final points.
The steepest descent paths are straight lines down to the constraint
boundaries. From the constraint boundaries, the steepest descent paths
travel down along the boundaries. The final point is either (0,0) or (11/4,11/4),
depending on whether the initial x-value is above or below 2.
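A quick numeric check of the two final points (written here in Python for illustration, under the two constraints above) confirms that both are feasible and that each lies on a constraint boundary; (11/4, 11/4) is exactly where the line y = x meets the parabola y = 5 - 4(x - 2)²:

```python
def feasible(x, y, tol=1e-9):
    # the two constraints: y >= |x| and y >= 5 - 4(x - 2)^2
    return y >= abs(x) - tol and y >= 5 - 4*(x - 2)**2 - tol

for x, y in [(0.0, 0.0), (11/4, 11/4)]:
    on_boundary = (abs(y - abs(x)) < 1e-9
                   or abs(y - (5 - 4*(x - 2)**2)) < 1e-9)
    print((x, y), feasible(x, y), on_boundary)   # both lines: True True
```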
Choosing a Solver
In this section...
“Table for Choosing a Solver” on page 1-17
“Solver Characteristics” on page 1-22
“Why Are Some Solvers Objects?” on page 1-24
Table for Choosing a Solver
There are six Global Optimization Toolbox solvers:

• ga (Genetic Algorithm)
• GlobalSearch
• MultiStart
• patternsearch, also called direct search
• simulannealbnd (Simulated Annealing)
• gamultiobj, which is not a minimizer; see Chapter 7, "Multiobjective
  Optimization"
Choose an optimizer based on problem characteristics and on the type of
solution you want. “Solver Characteristics” on page 1-22 contains more
information that can help you decide which solver is likely to be most suitable.
• “Explanation of “Desired Solution”” on page 1-18
• “Choosing Between Solvers for Smooth Problems” on page 1-20
• “Choosing Between Solvers for Nonsmooth Problems” on page 1-21
Desired Solution           Smooth Objective and Constraints
Single local solution      Optimization Toolbox functions; see
                           "Optimization Decision Table" in the
                           Optimization Toolbox documentation

Description of the Terms

Term                       Meaning
Single local solution      Find one local solution, a point x where the
                           objective function f(x) is a local minimum. For
                           more details, see "Local vs. Global Optima" on
                           page 1-12. In the example, both x1 and x2 are
                           local solutions.
Multiple local solutions   Find a set of local solutions. In the example,
                           the complete set of local solutions is {x1,x2}.
Single global solution     Find the point x where the objective function
                           f(x) is a global minimum. In the example, the
                           global solution is x2.
Choosing Between Solvers for Smooth Problems

Single Global Solution.

1 Try GlobalSearch first. It is most focused on finding a global solution, and
  has an efficient local solver, fmincon.

2 Try MultiStart second. It has efficient local solvers, and can search a
  wide variety of start points.

3 Try patternsearch third. It is less efficient, since it does not use gradients.
  However, patternsearch is robust and is more efficient than the remaining
  local solvers.

4 Try ga fourth. It can handle all types of constraints, and is usually more
  efficient than simulannealbnd.

5 Try simulannealbnd last. It can handle only unconstrained or bound
  constraints. simulannealbnd is usually the least efficient solver. However,
  given a slow enough cooling schedule, it can find a global solution.
Multiple Local Solutions. GlobalSearch and MultiStart both provide
multiple local solutions. For the syntax to obtain multiple solutions, see
"Multiple Solutions" on page 3-24. GlobalSearch and MultiStart differ in
the following characteristics:

• MultiStart can find more local minima. This is because GlobalSearch
  rejects many generated start points (initial points for local solution).
  Essentially, GlobalSearch accepts a start point only when it determines
  that the point has a good chance of obtaining a global minimum. In
  contrast, MultiStart passes all generated start points to a local solver. For
  more information, see "GlobalSearch Algorithm" on page 3-37.

• MultiStart offers a choice of local solver: fmincon, fminunc, lsqcurvefit,
  or lsqnonlin. The GlobalSearch solver uses only fmincon as its local
  solver.

• GlobalSearch uses a scatter-search algorithm for generating start points.
  In contrast, MultiStart generates points uniformly at random within
  bounds, or allows you to provide your own points.

• MultiStart can run in parallel. See "How to Use Parallel Processing"
  on page 8-11.
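The MultiStart strategy of passing every generated start point to a local solver, then collecting the distinct local minima, can be sketched with a few lines of Python (ours, under our own simplifying assumptions; this is not the toolbox algorithm). Here the "local solver" is the crude steepest-descent loop on the two-minimum function f(x) = x^4 - 2x^2:

```python
import random

def local_solve(x0, step=0.01, iters=5000):
    # crude steepest descent on f(x) = x^4 - 2x^2 (minima at -1 and 1)
    x = x0
    for _ in range(iters):
        x -= step * (4*x**3 - 4*x)
    return x

def multistart(n_starts, lb=-2.0, ub=2.0, seed=0):
    # generate start points uniformly at random within bounds, run the
    # local solver from every one, and keep the distinct solutions found
    rng = random.Random(seed)
    solutions = []
    for _ in range(n_starts):
        x = local_solve(rng.uniform(lb, ub))
        if all(abs(x - s) > 1e-3 for s in solutions):
            solutions.append(x)
    return sorted(solutions)

print(multistart(20))   # -> approximately [-1.0, 1.0]
```

With enough start points spread across the bounds, both basins are sampled, so both local minima appear in the solution list.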
Choosing Between Solvers for Nonsmooth Problems
Choose the applicable solver with the lowest number:
1 Use fminbnd first on one-dimensional bounded problems only. fminbnd
provably converges quickly in one dimension.
2 Use patternsearch on any other type of problem. patternsearch provably
converges, and handles all types of constraints.
3 Try fminsearch next for low-dimensional unbounded problems.
fminsearch is not as general as patternsearch and can fail to converge.
For low-dimensional problems,
fminsearch is simple to use, since it has
few tuning options.
4 Try ga next. ga has little supporting theory and is often less efficient than
patternsearch. It handles all types of constraints.
5 Try simulannealbnd last for unbounded problems, or for problems with
  bounds. simulannealbnd takes only bound constraints, and is often less
  efficient than ga. simulannealbnd provably converges only for a logarithmic
  cooling schedule, which is extremely slow.
Solver Characteristics

Solver           Convergence                      Characteristics
GlobalSearch     Fast convergence to local        Deterministic iterates
                 optima for smooth problems.      Gradient-based
                                                  Automatic stochastic start points
                                                  Removes many start points
                                                  heuristically

MultiStart       Fast convergence to local        Deterministic iterates
                 optima for smooth problems.      Can run in parallel; see "How to Use
                                                  Parallel Processing" on page 8-11
                                                  Gradient-based
                                                  Stochastic or deterministic start
                                                  points, or combination of both
                                                  Automatic stochastic start points
                                                  Runs all start points
                                                  Choice of local solver: fmincon,
                                                  fminunc, lsqcurvefit, or lsqnonlin

patternsearch    Proven convergence to            Deterministic iterates
                 local optimum, slower than       Can run in parallel; see "How to Use
                 gradient-based solvers.          Parallel Processing" on page 8-11
                                                  No gradients
                                                  User-supplied start point

ga               No convergence proof.            Stochastic iterates
                                                  Can run in parallel; see "How to Use
                                                  Parallel Processing" on page 8-11
                                                  Population-based
                                                  No gradients
                                                  Automatic start population, or
                                                  user-supplied population, or
                                                  combination of both

simulannealbnd   Proven to converge to global     Stochastic iterates
                 optimum for bounded problems     No gradients
                 with very slow cooling           User-supplied start point
                 schedule.                        Only bound constraints
Explanation of some characteristics:
• Convergence — Solvers can fail to converge to any solution when started far from a local minimum. When started near a local minimum, gradient-based solvers converge to a local minimum quickly for smooth problems. patternsearch provably converges for a wide range of problems, but the convergence is slower than with gradient-based solvers. Both ga and simulannealbnd can fail to converge in a reasonable amount of time for some problems, although they are often effective.
• Iterates — Solvers iterate to find solutions. The steps in the iteration are iterates. Some solvers have deterministic iterates. Others use random numbers and have stochastic iterates.
• Gradients — Some solvers use estimated or user-supplied derivatives in calculating the iterates. Other solvers do not use or estimate derivatives, but use only objective and constraint function values.
• Start points — Most solvers require you to provide a starting point for the optimization. One reason they require a start point is to obtain the dimension of the decision variables. ga does not require any starting points, because it takes the dimension of the decision variables as an input. ga can generate its start population automatically.
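For example, ga runs from just a problem dimension, with no start point (the objective here is illustrative):

```matlab
fun = @(x) x(1)^2 + x(2)^2;  % example smooth objective
nvars = 2;                   % dimension of the decision variables
x = ga(fun, nvars);          % ga generates its start population itself
```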
Compare the characteristics of Global Optimization Toolbox solvers to Optimization Toolbox solvers.

fmincon, fminunc, fseminf, lsqcurvefit, lsqnonlin
  Convergence: Proven quadratic convergence to local optima for smooth problems.
  Characteristics: Deterministic iterates. Gradient-based. User-supplied starting point.

fminsearch
  Convergence: No convergence proof — counterexamples exist.
  Characteristics: Deterministic iterates. No gradients. User-supplied start point.

fminbnd
  Convergence: Proven convergence to local optima for smooth problems, slower than quadratic.
  Characteristics: Deterministic iterates. No gradients. User-supplied start point. Only one-dimensional problems.

All these Optimization Toolbox solvers:
• Have deterministic iterates
• Start from one user-supplied point
• Search just one basin of attraction

Why Are Some Solvers Objects?
GlobalSearch and MultiStart are objects. What does this mean for you?
• You create a GlobalSearch or MultiStart object before running your problem.
• You can reuse the object for running multiple problems.
• GlobalSearch and MultiStart objects are containers for algorithms and global options. You use these objects to run a local solver multiple times. The local solver has its own options.
For more information, see the Object-Oriented Programming documentation.
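A minimal sketch of reusing one solver object for several problems (the objectives, start points, and bounds here are illustrative):

```matlab
gs = GlobalSearch;  % create the object once; it holds the global options

problem1 = createOptimProblem('fmincon','x0',[-1,2], ...
    'objective',@(x) x(1)^2 + x(2)^2,'lb',[-3,-3],'ub',[3,3]);
x1 = run(gs, problem1);   % first problem

problem2 = createOptimProblem('fmincon','x0',[2,1], ...
    'objective',@(x) (x(1)-1)^2 + x(2)^4,'lb',[-3,-3],'ub',[3,3]);
x2 = run(gs, problem2);   % same object, different problem
```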
2 Writing Files for Optimization Functions

• "Computing Objective Functions" on page 2-2
• "Constraints" on page 2-6
Computing Objective Functions
In this section...
“Objective (Fitness) Functions” on page 2-2
“Example: Writing a Function File” on page 2-2
“Example: Writing a Vectorized Function” on page 2-3
“Gradients and Hessians” on page 2-4
“Maximizing vs. Minimizing” on page 2-5
Objective (Fitness) Functions
To use Global Optimization Toolbox functions, you must first write a file (or an anonymous function) that computes the function you want to optimize. This function is called an objective function for most solvers, or a fitness function for ga. The function should accept a vector whose length is the number of independent variables, and should return a scalar. For vectorized solvers, the function should accept a matrix (where each row represents one input vector), and return a vector of objective function values. This section shows how to write the file.
Example: Writing a Function File
The following example shows how to write a file for the function you want to optimize. Suppose that you want to minimize the function

  f(x) = x1^2 - 2*x1*x2 + 6*x1 + 4*x2^2 - 3*x2.

The file that computes this function must accept a vector x of length 2, corresponding to the variables x1 and x2, and return a scalar equal to the value of the function at x.

1 Select New > Script (Ctrl+N) from the MATLAB File menu. This opens a new file in the Editor.

2 Enter the following code, and save the file as my_fun.m:

   function z = my_fun(x)
   z = x(1)^2 - 2*x(1)*x(2) + 6*x(1) + 4*x(2)^2 - 3*x(2);

To check that the file returns the correct value, enter

my_fun([2 3])

ans =
    31
Example: Writing a Vectorized Function
The ga and patternsearch solvers optionally compute the objective functions of a collection of vectors in one function call. This method can take less time than computing the objective functions of the vectors serially. This method is called a vectorized function call.

To compute in vectorized fashion:

• Write your objective function to:
  - Accept a matrix with an arbitrary number of rows
  - Return the vector of function values of each row
• If you have a nonlinear constraint, be sure to write the constraint in a vectorized fashion. For details, see "Vectorized Constraints" on page 2-7.
• Set the Vectorized option to 'on' with gaoptimset or psoptimset, or, in the Optimization Tool, set User function evaluation > Evaluate objective/fitness and constraint functions to vectorized. For patternsearch, also set CompletePoll to 'on'. Be sure to pass the options structure to the solver.

For example, to write the objective function of "Example: Writing a Function File" on page 2-2 in a vectorized fashion,

function z = my_fun(x)
z = x(:,1).^2 - 2*x(:,1).*x(:,2) + 6*x(:,1) + ...
    4*x(:,2).^2 - 3*x(:,2);

To use my_fun as a vectorized objective function for patternsearch, set the vectorization options and pass them to the solver:

options = psoptimset('CompletePoll','on','Vectorized','on');
[x fval] = patternsearch(@my_fun,[1 1],[],[],[],[],[],[],[],options);

For more information on writing vectorized functions for patternsearch, see "Vectorizing the Objective and Constraint Functions" on page 4-72. For more information on writing vectorized functions for ga, see "Vectorizing the Fitness Function" on page 5-81.

Gradients and Hessians
If you use GlobalSearch or MultiStart, your objective function can return derivatives (gradient, Jacobian, or Hessian). For details on how to include this syntax in your objective function, see "Writing Objective Functions" in the Optimization Toolbox documentation. Use optimset to set options so that your solver uses the derivative information:

Local Solver = fmincon, fminunc

Condition                                Option Setting
Objective function contains gradient     'GradObj' = 'on'
Objective function contains Hessian      'Hessian' = 'on'
Constraint function contains gradient    'GradConstr' = 'on'
Calculate Hessians of Lagrangian in      'Hessian' = 'on', 'HessFcn' =
an extra function                          function handle

For more information about Hessians for fmincon, see "Hessian".

Local Solver = lsqcurvefit, lsqnonlin

Condition                                Option Setting
Objective function contains Jacobian     'Jacobian' = 'on'
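As an illustrative sketch, an objective file that returns its gradient looks like the following (the function name and formula are hypothetical):

```matlab
% objective file myobj.m (hypothetical), returning value and gradient
function [f, g] = myobj(x)
f = x(1)^2 + 3*x(2)^2;      % objective value
if nargout > 1              % solver requested the gradient
    g = [2*x(1); 6*x(2)];   % analytic gradient
end
```

Pair it with opts = optimset('Algorithm','interior-point','GradObj','on') when building the problem structure, so fmincon consumes the gradient instead of estimating it.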
Maximizing vs. Minimizing
Global Optimization Toolbox optimization functions minimize the objective or fitness function. That is, they solve problems of the form

  min_x f(x).

If you want to maximize f(x), minimize -f(x), because the point at which the minimum of -f(x) occurs is the same as the point at which the maximum of f(x) occurs.

For example, suppose you want to maximize the function

  f(x) = x1^2 - 2*x1*x2 + 6*x1 + 4*x2^2 - 3*x2.

Write your function file to compute

  g(x) = -f(x) = -x1^2 + 2*x1*x2 - 6*x1 - 4*x2^2 + 3*x2,

and minimize g(x).
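A small illustrative sketch of the negation pattern, using a bounded one-dimensional function so that fminbnd applies (the function here is hypothetical):

```matlab
f = @(x) 1 - (x - 3).^2;          % function to maximize; peak at x = 3
g = @(x) -f(x);                   % negated objective
[xmax, gval] = fminbnd(g, 0, 10); % minimize g on [0, 10]
fmax = -gval;                     % maximum of f; xmax near 3, fmax near 1
```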
Constraints
Many Global Optimization Toolbox functions accept bounds, linear
constraints, or nonlinear constraints. To see how to include these constraints
in your problem, see “Writing Constraints” in the Optimization Toolbox
documentation. In particular, consult these sections:
• “Bound Constraints”
• “Linear Inequality Constraints” (linear equality constraints have the same
form)
• “Nonlinear Constraints”
Set Bounds
It is more important to set bounds for global solvers than for local solvers.
Global solvers use bounds in a variety of ways:
• GlobalSearch requires bounds for its scatter-search point generation. If you do not provide bounds, GlobalSearch bounds each component below by -9999 and above by 10001. However, these bounds can easily be inappropriate.
• If you do not provide bounds and do not provide custom start points, MultiStart bounds each component below by -1000 and above by 1000. However, these bounds can easily be inappropriate.
• ga uses bounds and linear constraints for its initial population generation. For unbounded problems, ga uses a default of 0 as the lower bound and 1 as the upper bound for each dimension for initial point generation. For bounded problems, and problems with linear constraints, ga uses the bounds and constraints to make the initial population.
• simulannealbnd and patternsearch do not require bounds, although they can use bounds.
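A minimal sketch of supplying bounds through createOptimProblem (the objective and bound values are illustrative):

```matlab
problem = createOptimProblem('fmincon', ...
    'x0', [0 0], ...
    'objective', @(x) x(1)^2 + x(2)^2, ...
    'lb', [-5 -5], ...   % lower bound on each component
    'ub', [ 5  5]);      % upper bound on each component
```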
Gradients and Hessians
If you use GlobalSearch or MultiStart with fmincon, your nonlinear
constraint functions can return derivatives (gradient or Hessian). For details,
see “Gradients and Hessians” on page 2-4.
Vectorized Constraints
The ga and patternsearch solvers optionally compute the nonlinear constraint functions of a collection of vectors in one function call. This method can take less time than computing the objective functions of the vectors serially. This method is called a vectorized function call.

For the solver to compute in a vectorized manner, you must vectorize both your objective (fitness) function and nonlinear constraint function. For details, see "Vectorizing the Objective and Constraint Functions" on page 4-72.

As an example, suppose your nonlinear constraints for a three-dimensional problem are

  x1^2/4 + x2^2/9 + x3^2/25 <= 6
  x3 >= cosh(x1 + x2)
  x1*x2*x3 = 2.

The following code gives these nonlinear constraints in a vectorized fashion, assuming that the rows of your input matrix x are your input vectors:

function [c ceq] = nlinconst(x)

c = x(:,1).^2/4 + x(:,2).^2/9 + x(:,3).^2/25 - 6;
c = [c, cosh(x(:,1) + x(:,2)) - x(:,3)];
ceq = x(:,1).*x(:,2).*x(:,3) - 2;

Running patternsearch on a vectorized objective vfun with these constraints yields:

Optimization terminated: mesh size less than options.TolMesh
 and constraint violation is less than options.TolCon.

x =
    0.2191    0.7500   12.1712

fval =
 -148.7480

Using ga:

options = gaoptimset('Vectorized','on');
[x fval] = ga(@vfun,3,[],[],[],[],[],[],@nlinconst,options)

Optimization terminated: maximum number of generations exceeded.

x =
   -1.4098   -0.1216   11.6664

fval =
 -138.1066

For this problem patternsearch computes the solution far more quickly and accurately.
3 Using GlobalSearch and MultiStart

• "How to Optimize with GlobalSearch and MultiStart" on page 3-2
• "Examining Results" on page 3-23
• "How GlobalSearch and MultiStart Work" on page 3-35
• "Improving Results" on page 3-45
• "GlobalSearch and MultiStart Examples" on page 3-63
How to Optimize with GlobalSearch and MultiStart
In this section...
“Problems You Can Solve with GlobalSearch and MultiStart” on page 3-2
“Outline of Steps” on page 3-2
“Determining Inputs for the Problem” on page 3-4
“Create a Problem Structure” on page 3-4
“Create a Solver Object” on page 3-13
“Set Start Points for M ultiStart” on page 3-16
“Run the Solver” on page 3-19
Problems You Can Solve with GlobalSearch and
MultiStart
The GlobalSearch and MultiStart solvers apply to problems with smooth
objective and constraint functions. The solvers search for a global minimum,
or for a set of local minima. For more information on which solver to use,
see “Choosing a Solver” on page 1-17.
GlobalSearch and MultiStart work by starting a local solver, such as fmincon, from a variety of start points. Generally the start points are random. However, for MultiStart you can provide a set of start points. For more information, see "How GlobalSearch and MultiStart Work" on page 3-35.

To find out how to use these solvers, see "Outline of Steps" on page 3-2. For examples using these solvers, see the example index.
Outline of Steps
To find a global or multiple local solutions:

1 "Create a Problem Structure" on page 3-4
2 "Create a Solver Object" on page 3-13
3 (Optional, MultiStart only) "Set Start Points for MultiStart" on page 3-16
4 "Run the Solver" on page 3-19

The following figure illustrates these steps.
[Figure: Optimization workflow. The local solver, objective, constraints, x0, and local options are inputs to the createOptimProblem command, which produces a problem structure. Global options are inputs to GlobalSearch or MultiStart, which produces a solver object. Optionally (MultiStart only), start points are inputs to RandomStartPointSet or CustomStartPointSet, which produces a start points object. The run command takes these objects and structures and produces results.]
Determining Inputs for the Problem
A problem structure defines a local optimization problem using a local solver, such as fmincon. Therefore, most of the documentation on choosing a solver or preparing the inputs is in the Optimization Toolbox documentation.

Required Inputs

Input                    Link to Documentation
Local solver             "Optimization Decision Table" in the Optimization Toolbox documentation
Objective function       "Computing Objective Functions" on page 2-2
Start point x0

Optional Inputs

Input                    Link to Documentation
Constraint functions     "Constraints" on page 2-6
Local options structure  "Setting Options"
Create a Problem Structure
To use the GlobalSearch or MultiStart solvers, you must first create a
problem structure. There are two ways to create a problem structure:
• “Using the createOptimProblem Function” on page 3-4
• “Exporting from the Optimization Tool” on page 3-8
Using the createOptimProblem Function
Follow these steps to create a problem structure using the createOptimProblem function.

1 Define your objective function as a file or anonymous function. For details, see "Computing Objective Functions" on page 2-2. If your solver is lsqcurvefit or lsqnonlin, ensure the objective function returns a vector, not scalar.

2 If relevant, create your constraints, such as bounds and nonlinear constraint functions. For details, see "Constraints" on page 2-6.

3 Create a start point. For example, to create a three-dimensional random start point

   xstart = randn(3,1);

4 (Optional) Create an options structure using optimset. For example,

   options = optimset('Algorithm','interior-point');

5 Enter

   problem = createOptimProblem(solver,

   where solver is the name of your local solver:
   • For GlobalSearch: 'fmincon'
   • For MultiStart the choices are:
     - 'fmincon'
     - 'fminunc'
     - 'lsqcurvefit'
     - 'lsqnonlin'
   For help choosing, see "Optimization Decision Table".

6 Set an initial point using the 'x0' parameter. If your initial point is xstart, and your solver is fmincon, your entry is now

   problem = createOptimProblem('fmincon','x0',xstart,

7 Include the function handle for your objective function in objective:

   problem = createOptimProblem('fmincon','x0',xstart, ...
       'objective',@objfun,

8 Set bounds and other constraints as applicable.

   Constraint                                        Name
   lower bounds                                      'lb'
   upper bounds                                      'ub'
   matrix Aineq for linear inequalities              'Aineq'
     Aineq x <= bineq
   vector bineq for linear inequalities              'bineq'
     Aineq x <= bineq
   matrix Aeq for linear equalities Aeq x = beq      'Aeq'
   vector beq for linear equalities Aeq x = beq      'beq'
   nonlinear constraint function                     'nonlcon'

9 If using the lsqcurvefit local solver, include vectors of input data and response data, named 'xdata' and 'ydata' respectively.

10 Best practice: validate the problem structure by running your solver on the structure. For example, if your local solver is fmincon:

   [x fval eflag output] = fmincon(problem);

Note Specify fmincon as the solver for GlobalSearch, even if you have no constraints. However, you cannot run fmincon on a problem without constraints. Add an artificial constraint to the problem to validate the structure:

   problem.lb = -Inf;
Example: Creating a Problem Structure with createOptimProblem.
This example minimizes the function from "Run the Solver" on page 3-19, subject to the constraint x1 + 2x2 >= 4. The objective is

  sixmin = 4x^2 - 2.1x^4 + x^6/3 + xy - 4y^2 + 4y^4.

Use the interior-point algorithm of fmincon, and set the start point to [2;3].

1 Write a function handle for the objective function.

   sixmin = @(x)(4*x(1)^2 - 2.1*x(1)^4 + x(1)^6/3 ...
       + x(1)*x(2) - 4*x(2)^2 + 4*x(2)^4);

2 Write the linear constraint matrices. Change the constraint to "less than" form:

   A = [-1,-2];
   b = -4;
3 Launch the Optimization Tool by entering optimtool at the MATLAB
command line.
4 Set the solver, algorithm, objective, start point, and constraints.
5 Best practice: run the problem to verify the setup.
The problem runs successfully.
6 Choose File > Export to Workspace and select Export problem and
options to a MATLAB structure named
Create a Solver Object
A solver object contains your preferences for the global portion of the optimization.

You do not need to set any preferences. Create a GlobalSearch object named gs with default settings as follows:

gs = GlobalSearch;
Similarly, create a MultiStart object named ms with default settings as
follows:
ms = MultiStart;
Properties (Global Options) of Solver Objects
Global options are properties of a GlobalSearch or MultiStart object.

Properties for both GlobalSearch and MultiStart

Display
  Detail level of iterative display. Set to 'off' for no display, 'final' (default) for a report at the end of the run, or 'iter' for reports as the solver progresses. For more information and examples, see "Iterative Display" on page 3-28.

TolFun
  Solvers consider objective function values within TolFun of each other to be identical (not distinct). Default: 1e-6. Solvers group solutions when the solutions satisfy both TolFun and TolX tolerances.

TolX
  Solvers consider solutions within TolX distance of each other to be identical (not distinct). Default: 1e-6. Solvers group solutions when the solutions satisfy both TolFun and TolX tolerances.

MaxTime
  Solvers halt if the run exceeds MaxTime seconds, as measured by a clock (not processor seconds). Default: Inf

StartPointsToRun
  Choose whether to run 'all' (default) start points, only those points that satisfy 'bounds', or only those points that are feasible with respect to bounds and inequality constraints with 'bounds-ineqs'. For an example, see "Example: Using Only Feasible Start Points" on page 3-70.
Properties for GlobalSearch

NumTrialPoints
  Number of trial points to examine. Default: 1000

BasinRadiusFactor, DistanceThresholdFactor, MaxWaitCycle, NumStageOnePoints, PenaltyThresholdFactor
  See "Properties" on page 12-30 for detailed descriptions of these properties.
Properties for MultiStart

UseParallel
  When 'always', MultiStart attempts to distribute start points to multiple processors for the local solver. Disable by setting to 'never' (default). For details, see "How to Use Parallel Processing" on page 8-11. For an example, see "Example: Parallel MultiStart" on page 3-74.
Example: Creating a Nondefault GlobalSearch Object
Suppose you want to solve a problem and:
• Consider local solutions identical if they are within 0.01 of each other and the function values are within the default TolFun tolerance.
• Spend no more than 2000 seconds on the computation.
To solve the problem, create a GlobalSearch object gs as follows:

gs = GlobalSearch('TolX',0.01,'MaxTime',2000);
Example: Creating a Nondefault MultiStart Object
Suppose you want to solve a problem such that:
• You consider local solutions identical if they are within 0.01 of each other and the function values are within the default TolFun tolerance.
• You spend no more than 2000 seconds on the computation.
To solve the problem, create a MultiStart object ms as follows:

ms = MultiStart('TolX',0.01,'MaxTime',2000);
Set Start Points for MultiStart
There are four ways to tell MultiStart which start points to use for the local solver:
• Pass a positive integer k. MultiStart generates k-1 start points as if using a RandomStartPointSet object and the problem structure. MultiStart also uses the x0 start point from the problem structure, for a total of k start points.
• Pass a RandomStartPointSet object.
• Pass a CustomStartPointSet object.
• Pass a cell array of RandomStartPointSet and CustomStartPointSet objects. Pass a cell array if you have some specific points you want to run, but also want MultiStart to use other random start points.

Note You can control whether MultiStart uses all start points, or only those points that satisfy bounds or other inequality constraints. For more information, see "Filter Start Points (Optional)" on page 3-42.
Positive Integer for Start Points
The syntax for running MultiStart for k start points is

[xmin,fmin,flag,outpt,allmins] = run(ms,problem,k);

CustomStartPointSet Object for Start Points
To use a specific set of starting points, package them in a CustomStartPointSet as follows:

1 Place the starting points in a matrix. Each row of the matrix represents one starting point. MultiStart runs all the rows of the matrix, subject to filtering with the StartPointsToRun property. For more information, see "MultiStart Algorithm" on page 3-41.

2 Create a CustomStartPointSet object from the matrix:

   tpoints = CustomStartPointSet(ptmatrix);
For example, create a set of 40 five-dimensional points, with each component of a point equal to 10 plus an exponentially distributed variable with mean 25:

ptmatrix = -25*log(rand(40,5)) + 10;
tpoints = CustomStartPointSet(ptmatrix);

To get the original matrix of points from a CustomStartPointSet object, use the list method:

pts = list(tpoints); % Assumes tpoints is a CustomStartPointSet

A CustomStartPointSet has two properties: DimStartPoints and NumStartPoints. You can use these properties to query a CustomStartPointSet object. For example, the tpoints object in the example has the following properties:

tpoints.DimStartPoints

ans =
     5

tpoints.NumStartPoints

ans =
    40
Cell Array of Objects for Start Points
To use a specific set of starting points along with some randomly generated points, pass a cell array of RandomStartPointSet and CustomStartPointSet objects.

For example, to use both the 40 specific five-dimensional points of "CustomStartPointSet Object for Start Points" on page 3-17 and 40 additional five-dimensional points from a RandomStartPointSet object:

pts = RandomStartPointSet('NumStartPoints',40);
allpts = {pts,tpoints};

To find several local minima of the problem described in "Run the Solver" on page 3-19 using 50 runs of fmincon with MultiStart, enter:

% % Set the default stream to get exactly the same output
% s = RandStream('mt19937ar','Seed',14);
% RandStream.setDefaultStream(s);
ms = MultiStart;
opts = optimset('Algorithm','interior-point');
sixmin = @(x)(4*x(1)^2 - 2.1*x(1)^4 + x(1)^6/3 ...
    + x(1)*x(2) - 4*x(2)^2 + 4*x(2)^4);
problem = createOptimProblem('fmincon','x0',[-1,2],...
    'objective',sixmin,'lb',[-3,-3],'ub',[3,3],...
    'options',opts);
[xminm,fminm,flagm,outptm,manyminsm] = run(ms,problem,50);

In this case, MultiStart located all six local minima, while GlobalSearch located four. For pictures of the MultiStart solutions, see "Example: Visualizing the Basins of Attraction" on page 3-32.
Examining Results
In this section...
“Single Solution” on page 3-23
“Multiple Solutions” on page 3-24
“Iterative Display” on page 3-28
“Global Output Structures” on page 3-31
“Example: Visualizing the Basins of Attraction” on page 3-32
Single Solution
You obtain the single best solution found during the run by calling run with the syntax

[x fval eflag output] = run(...);

• x is the location of the local minimum with smallest objective function value.
• fval is the objective function value evaluated at x.
• eflag is an exit flag for the global solver. Values:
Global Solver Exit Flags

 2   At least one local minimum found. Some runs of the local solver converged (had positive exit flag).
 1   At least one local minimum found. All runs of the local solver converged (had positive exit flag).
 0   No local minimum found. Local solver called at least once, and at least one local solver exceeded the MaxIter or MaxFunEvals tolerances.
-1   One or more local solver runs stopped by the local solver output function or plot function.
-2   No feasible local minimum found.
-5   MaxTime limit exceeded.
-8   No solution found. All runs had local solver exit flag -2 or smaller, not all equal -2.
-10  Failures encountered in user-provided functions.

• output is a structure with details about the multiple runs of the local solver. For more information, see "Global Output Structures" on page 3-31.
The list of outputs is for the case eflag > 0. If eflag <= 0, then x is the following:
• If some local solutions are feasible, x represents the location of the lowest objective function value. "Feasible" means the constraint violations are smaller than problem.options.TolCon.
• If no solutions are feasible, x is the solution with lowest infeasibility.
• If no solutions exist, x, fval, and output are empty ([]).
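For instance, a run might branch on the exit flag like this sketch (assuming gs and problem exist, as in the earlier examples):

```matlab
[x, fval, eflag, output] = run(gs, problem);
if eflag > 0
    fprintf('Local minimum found; %d of %d runs converged.\n', ...
        output.localSolverSuccess, output.localSolverTotal);
else
    disp('No converged minimum; x holds the best point found, if any.')
end
```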
Multiple Solutions
You obtain multiple solutions in an object by calling run with the syntax

[x fval eflag output manymins] = run(...);

manymins is a vector of solution objects; see GlobalOptimSolution. The vector is in order of objective function value, from lowest (best) to highest (worst). Each solution object contains the following properties (fields):
• X — a local minimum
• Fval — the value of the objective function at X
• Exitflag — the exit flag for the global solver (described here)
• Output — an output structure (described here)
• X0 — a cell array of start points that led to the solution point X

There are several ways to examine the vector of solution objects:
• In the MATLAB Workspace Browser. Double-click the solution object, and
then double-click the resulting display in the Variable Editor.
• Using dot addressing. GlobalOptimSolution properties are capitalized.
Use proper capitalization to access the properties.
For example, to find the vector of function values, enter:

fcnvals = [manymins.Fval]

fcnvals =
   -1.0316   -0.2155         0

To get a cell array of all the start points that led to the lowest function value (the first element of manymins), enter:

smallX0 = manymins(1).X0

• Plot some field values. For example, to see the range of resulting Fval, enter:

hist([manymins.Fval])

This results in a histogram of the computed function values. (The figure shows a histogram from a different example than the previous few figures.)
Example: Changing the Definition of Distinct Solutions
You might find out, after obtaining multiple local solutions, that your tolerances were not appropriate. You can have many more local solutions than you want, spaced too closely together. Or you can have fewer solutions than you want, with GlobalSearch or MultiStart clumping together too many solutions.

To deal with this situation, run the solver again with different tolerances. The TolX and TolFun tolerances determine how the solvers group their outputs into the GlobalOptimSolution vector. These tolerances are properties of the GlobalSearch or MultiStart object.

For example, suppose you want to use the active-set algorithm in fmincon to solve the problem in "Example of Run with MultiStart" on page 3-20. Further suppose that you want tolerances of 0.01 for both TolX and TolFun. The run method groups local solutions whose objective function values are within TolFun of each other, and which are also less than TolX apart:

MultiStart completed the runs from all start points.

All 50 local solver runs converged with a
positive local solver exit flag.

someminsm

someminsm =

  1x6 GlobalOptimSolution

  Properties:
    X
    Fval
    Exitflag
    Output
    X0

In this case, MultiStart generated six distinct solutions. Here "distinct" means that the solutions are more than 0.01 apart in either objective function value or location.

Iterative Display
Iterative display gives you information about the progress of solvers during their runs.

There are two types of iterative display:
• Global solver display
• Local solver display

Both types appear at the command line, depending on global and local options.
Obtain local solver iterative display by setting the Display option in the problem.options structure to 'iter' or 'iter-detailed' with optimset. For more information, see "Displaying Iterative Output" in the Optimization Toolbox documentation.

Obtain global solver iterative display by setting the Display property of the GlobalSearch or MultiStart object to 'iter'.

Global solvers set the default Display option of the local solver to 'off', unless the problem structure has a value for this option. Global solvers do not override any setting you make for local options.

Note Setting the local solver Display option to anything other than 'off' can produce a great deal of output. The default Display option created by optimset is 'final'.
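Concretely, turning on both displays might look like this sketch (the solver object and problem structure are assumed to exist):

```matlab
problem.options = optimset(problem.options, 'Display', 'iter'); % local
gs = GlobalSearch('Display', 'iter');                           % global
[x, fval] = run(gs, problem);  % prints intermingled global and local output
```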
                                        First-order   Norm of
 Iter F-count        f(x)   Feasibility  optimality      step
    0       3  -5.372419e-001  0.000e+000  1.913e+000
... MANY ITERATIONS DELETED ...
    9      39  -2.154638e-001  0.000e+000  2.425e-007  6.002e-008

Local minimum found that satisfies the constraints.

Optimization completed because the objective function is non-decreasing in
feasible directions, to within the selected value of the function tolerance,
and constraints were satisfied to within the selected value of the constraint
tolerance.

<stopping criteria details>

GlobalSearch stopped because it analyzed all the start points.

All 7 local solver runs converged with a positive local solver exit flag.
Setting GlobalSearch iterative display, as well as fmincon iterative display, yields both displays intermingled.
For an example of iterative display in a parallel environment, see “Example:
Parallel MultiStart” on page 3-74.
Global Output Structures
run can produce two types of output structures:

• A global output structure. This structure contains information about the overall run from multiple starting points. Details follow.

• Local solver output structures. The vector of GlobalOptimSolution objects contains one such structure in each element of the vector. For a description of this structure, see "Output Structures" in the Optimization Toolbox documentation, or the function reference pages for the local solvers: fmincon, fminunc, lsqcurvefit, or lsqnonlin.

Global Output Structure

Field                   Meaning
funcCount               Total number of calls to user-supplied functions (objective or nonlinear constraint)
localSolverTotal        Number of local solver runs started
localSolverSuccess      Number of local solver runs that finished with a positive exit flag
localSolverIncomplete   Number of local solver runs that finished with a 0 exit flag
localSolverNoSolution   Number of local solver runs that finished with a negative exit flag
message                 GlobalSearch or MultiStart exit message

A positive exit flag from a local solver generally indicates a successful run. A negative exit flag indicates a failure. A 0 exit flag indicates that the solver stopped by exceeding the iteration or function evaluation limit. For more information, see "Exit Flags and Exit Messages" or "Tolerances and Stopping Criteria" in the Optimization Toolbox documentation.
Example: Visualizing the Basins of Attraction
Which start points lead to which basin? For a steepest descent solver, nearby
points generally lead to the same basin; see “Basins of Attraction” on page
1-13. However, for Optimization Toolbox solvers, basins are more complicated.
Plot the
MultiStart start points from the example, “Example of Run with
MultiStart” on page 3-20, color-coded with the basin where they end.
% s = RandStream('mt19937ar','Seed',14);
% RandStream.setDefaultStream(s);
% Uncomment the previous lines to get the same output
ms = MultiStart;
opts = optimset('Algorithm','interior-point');
sixmin = @(x)(4*x(1)^2 - 2.1*x(1)^4 + x(1)^6/3 ...
    + x(1)*x(2) - 4*x(2)^2 + 4*x(2)^4);
problem = createOptimProblem('fmincon','x0',[-1,2],...
    'objective',sixmin,'lb',[-3,-3],'ub',[3,3],...
    'options',opts);
[xminm,fminm,flagm,outptm,manyminsm] = run(ms,problem,50);

possColors = 'kbgcrm';
hold on
for i = 1:size(manyminsm,2)
    % Color of this line
    cIdx = rem(i-1, length(possColors)) + 1;
    color = possColors(cIdx);

    % Plot the start points that led to minimum i
    u = manyminsm(i).X0;
    x0ThisMin = reshape([u{:}],2,length(u));
    plot(x0ThisMin(1,:),x0ThisMin(2,:),'.','Color',color)

    % Mark the basin center (the local minimum itself) with a *
    plot(manyminsm(i).X(1),manyminsm(i).X(2),'*','Color',color,...
        'MarkerSize',25)
end
hold off
Start points do not always converge to the closest basin. For example, the red
points are closer to the green basin center than to the red basin center. Also,
many black and blue start points are closer to the opposite basin centers.
The magenta and red basins are shallow, as you can see in the following
contour plot.
How GlobalSearch and MultiStart Work
In this section...
“Multiple Runs of a Local Solver” on page 3-35
“Differences Between the Solver Objects” on page 3-35
“GlobalSearch Algorithm” on page 3-37
“MultiStart Algorithm” on page 3-41
“Bibliography” on page 3-44
Multiple Runs of a Local Solver
GlobalSearch and MultiStart have similar approaches to finding global or
multiple minima. Both algorithms start a local solver (such as fmincon) from
multiple start points. The algorithms use multiple start points to sample
multiple basins of attraction. For more information, see “Basins of Attraction”
on page 1-13.

Differences Between the Solver Objects
GlobalSearch and MultiStart Algorithm Overview on page 3-36 contains a
sketch of the GlobalSearch and MultiStart algorithms.
GlobalSearch Algorithm:
• Run fmincon from x0
• Generate trial points (potential start points)
• Stage 1: Run best start point among the first NumStageOnePoints trial points
• Stage 2: Loop through remaining trial points; run fmincon if a point
  satisfies the basin, score, and constraint filters
• Create GlobalOptimSolutions vector

MultiStart Algorithm:
• Generate start points
• Run start points
• Create GlobalOptimSolutions vector

GlobalSearch and MultiStart Algorithm Overview
The main differences between GlobalSearch and MultiStart are:
• GlobalSearch uses a scatter-search mechanism for generating start points.
MultiStart uses uniformly distributed start points within bounds, or
user-supplied start points.
• GlobalSearch analyzes start points and rejects those points that are
unlikely to improve the best local minimum found so far. MultiStart runs
all start points (or, optionally, all start points that are feasible with respect
to bounds or inequality constraints).
• MultiStart gives a choice of local solver: fmincon, fminunc, lsqcurvefit,
or lsqnonlin. The GlobalSearch algorithm uses fmincon.
• MultiStart can run in parallel, distributing start points to multiple
processors for local solution. To run MultiStart in parallel, see “How to
Use Parallel Processing” on page 8-11.
Deciding Which Solver to Use
The differences between these solver objects boil down to the following
decision on which to use:
• Use GlobalSearch to find a single global minimum most efficiently on a
single processor.
• Use MultiStart to:
- Find multiple local minima.
- Run in parallel.
- Use a solver other than fmincon.
- Search thoroughly for a global minimum.
- Explore your own start points.
GlobalSearch Algorithm
For a description of the algorithm, see Ugray et al. [1].
When you run a GlobalSearch object, the algorithm performs the following
steps:
• “Run fmincon from x0” on page 3-38
• “Generate Trial Points” on page 3-38
• “Obtain Stage 1 Start Point, Run” on page 3-38
• “Initialize Basins, Counters, Threshold” on page 3-38
• “Begin Main Loop” on page 3-39
• “Examine Stage 2 Trial Point to See if fmincon Runs” on page 3-39
• “When fmincon Runs:” on page 3-39
• “When fmincon Does Not Run:” on page 3-40
• “Create GlobalOptimSolution” on page 3-41
Run fmincon from x0
GlobalSearch runs fmincon from the start point you give in the problem
structure. If this run converges, GlobalSearch records the start point
and end point for an initial estimate on the radius of a basin of attraction.
Furthermore, GlobalSearch records the final objective function value for use
in the score function (see “Obtain Stage 1 Start Point, Run” on page 3-38).

The score function is the sum of the objective function value at a point and a
multiple of the sum of the constraint violations. So a feasible point has score
equal to its objective function value. The multiple for constraint violations is
initially 1000. GlobalSearch updates the multiple during the run.

Generate Trial Points
GlobalSearch uses the scatter search algorithm to generate a set of
NumTrialPoints trial points. Trial points are potential start points. For a
description of the scatter search algorithm, see Glover [2]. The trial points
have components in the range (–1e4+1,1e4+1). This range is not symmetric
about the origin so that the origin is not in the scatter search.
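The score definition can be sketched as follows. This is an illustration of the description above, not the toolbox internals; the handles fcn and c are assumptions, with c returning inequality constraint values that are nonpositive at feasible points:

```matlab
penalty = 1000;   % initial multiple for constraint violations
% score = objective value + penalty * (total constraint violation)
score = @(fcn,c,p) fcn(p) + penalty*sum(max(c(p),0));

% A feasible point (all c(p) <= 0) scores exactly its objective value;
% an infeasible point is penalized by 1000 times its total violation.
```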
Obtain Stage 1 Start Point, Run
GlobalSearch evaluates the score function of a set of NumStageOnePoints
trial points. It then takes the point with the best score and runs fmincon
from that point. GlobalSearch removes the set of NumStageOnePoints trial
points from its list of points to examine.
Initialize Basins, Counters, Threshold
The localSolverThreshold is initially the smaller of the two objective
function values at the solution points. The solution points are the fmincon
solutions starting from x0 and from the Stage 1 start point. If both of these
solution points do not exist or are infeasible, localSolverThreshold is
initially the penalty function value of the Stage 1 start point.

The GlobalSearch heuristic assumption is that basins of attraction are
spherical. The initial estimate of basins of attraction for the solution point
from x0 and the solution point from Stage 1 are spheres centered at the
solution points. The radius of each sphere is the distance from the initial
point to the solution point. These estimated basins can overlap.
There are two sets of counters associated with the algorithm. Each counter is
the number of consecutive trial points that:
• Lie within a basin of attraction. There is one counter for each basin.
• Have score function greater than localSolverThreshold. For a definition
of the score, see “Run fmincon from x0” on page 3-38.
All counters are initially 0.
Begin Main Loop
GlobalSearch repeatedly examines a remaining trial point from the list, and
performs the following steps. It continually monitors the time, and stops the
search if elapsed time exceeds MaxTime seconds.
Examine Stage 2 Trial Point to See if fmincon Runs
Call the trial point p. Run fmincon from p if the following conditions hold:
• p is not in any existing basin. The criterion for every basin i is:

|p - center(i)| > DistanceThresholdFactor * radius(i).

DistanceThresholdFactor is an option (default value 0.75). radius is an
estimated radius that updates in Update Basin Radius and Threshold and
React to Large Counter Values.
• score(p) < localSolverThreshold.
• (optional) p satisfies bound and/or inequality constraints. This test occurs
if you set the StartPointsToRun property of the GlobalSearch object to
'bounds' or 'bounds-ineqs'.
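The basin and score filters can be sketched as follows. This is an illustration under the spherical-basin assumption, not the toolbox code; the names centers, radii, scoreP, and thresh are hypothetical:

```matlab
distFactor = 0.75;   % default DistanceThresholdFactor
% p lies inside basin i when |p - center(i)| <= distFactor*radius(i)
inBasin = @(p,centers,radii) any(sqrt(sum((centers - ...
    repmat(p,size(centers,1),1)).^2, 2)) <= distFactor*radii);
% Run the local solver only from a point outside every basin whose
% score is below the current threshold
runLocal = @(p,centers,radii,scoreP,thresh) ...
    ~inBasin(p,centers,radii) && scoreP < thresh;
```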
When fmincon Runs:
1 Reset Counters. Set the counters for basins and threshold to 0.

2 Update Solution Set. If fmincon runs starting from p, it can yield a positive
exit flag, which indicates convergence. In that case, GlobalSearch updates
the vector of GlobalOptimSolution objects. Call the solution point xp and
the objective function value fp. There are two cases:
• For every other solution point xq with objective function value fq,

|xq - xp| > TolX * max(1,|xp|)

or

|fq - fp| > TolFun * max(1,|fp|).

In this case, GlobalSearch creates a new element in the vector of
GlobalOptimSolution objects. For details of the information contained
in each object, see GlobalOptimSolution.

• For some other solution point xq with objective function value fq,

|xq - xp| <= TolX * max(1,|xp|)

and

|fq - fp| <= TolFun * max(1,|fp|).

In this case, GlobalSearch regards xp as equivalent to xq. The
GlobalSearch algorithm modifies the GlobalOptimSolution of xq by
adding p to the cell array of X0 points.

There is one minor tweak that can happen to this update. If the exit flag
for xq is greater than 1, and the exit flag for xp is 1, then xp replaces xq.
This replacement can lead to some points in the same basin being more
than a distance of TolX from xp.

3 Update Basin Radius and Threshold. If the exit flag of the current fmincon
run is positive:

a Set threshold to the score value at start point p.

b Set basin radius for xp equal to the maximum of the existing radius (if
any) and the distance between p and xp.
4 Report to Iterative Display. When the GlobalSearch Display property is
'iter', every point that fmincon runs creates one line in the GlobalSearch
iterative display.
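The relative-tolerance comparison in step 2 can be sketched as follows (the TolX and TolFun values are illustrative, and |·| is taken here as the vector norm):

```matlab
TolX = 1e-6;  TolFun = 1e-6;
% xp duplicates xq when both the points and the objective values
% agree to within the relative tolerances
sameSolution = @(xp,fp,xq,fq) ...
    norm(xq - xp) <= TolX*max(1,norm(xp)) && ...
    abs(fq - fp) <= TolFun*max(1,abs(fp));
```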
When fmincon Does Not Run:
1 Update Counters. Increment the counter for every basin containing p. Reset
the counter of every other basin to 0.

Increment the threshold counter if score(p) >= localSolverThreshold.
Otherwise, reset the counter to 0.

2 React to Large Counter Values. For each basin with counter equal to
MaxWaitCycle, multiply the basin radius by 1 – BasinRadiusFactor. Reset
the counter to 0. (Both MaxWaitCycle and BasinRadiusFactor are settable
properties of the GlobalSearch object.)

If the threshold counter equals MaxWaitCycle, increase the threshold:

new threshold = threshold + PenaltyThresholdFactor*(1 + abs(threshold)).

Reset the counter to 0.

3 Report to Iterative Display. Every 200th trial point creates one line in the
GlobalSearch iterative display.
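The two counter reactions can be written out numerically. The factor value 0.2 below is an assumption for illustration; check the GlobalSearch property defaults for your release:

```matlab
BasinRadiusFactor = 0.2;  PenaltyThresholdFactor = 0.2;  % assumed values
shrinkRadius   = @(r) (1 - BasinRadiusFactor)*r;         % basin reaction
raiseThreshold = @(t) t + PenaltyThresholdFactor*(1 + abs(t)); % threshold
```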
Create GlobalOptimSolution
After reaching MaxTime seconds or running out of trial points, GlobalSearch
creates a vector of GlobalOptimSolution objects. GlobalSearch orders the
vector by objective function value, from lowest (best) to highest (worst). This
concludes the algorithm.
MultiStart Algorithm
When you run a MultiStart object, the algorithm performs the following
steps:
• “Generate Start Points” on page 3-41
• “Filter Start Points (Optional)” on page 3-42
• “Run Local Solver” on page 3-42
• “Check Stopping Conditions” on page 3-43
• “Create GlobalOptimSolution Object” on page 3-43
Generate Start Points
If you call MultiStart with the syntax
[x fval] = run(ms,problem,k)
for an integer k, MultiStart generates k – 1 start points exactly as if you
used a RandomStartPointSet object. The algorithm also uses the x0 start
point from the problem structure, for a total of k start points.

A RandomStartPointSet object does not have any points stored inside
the object. Instead, MultiStart calls the list method, which generates
random points within the bounds given by the problem structure. If an
unbounded component exists, list uses an artificial bound given by the
ArtificialBound property of the RandomStartPointSet object.

If you provide a CustomStartPointSet object, MultiStart does not generate
start points, but uses the points in the object.
Filter Start Points (Optional)
If you set the StartPointsToRun property of the MultiStart object to
'bounds' or 'bounds-ineqs', MultiStart does not run the local solver from
infeasible start points. In this context, “infeasible” means start points that
do not satisfy bounds, or start points that do not satisfy both bounds and
inequality constraints.

The default setting of StartPointsToRun is 'all'. In this case, MultiStart
does not discard infeasible start points.
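A sketch of the 'bounds-ineqs' test follows. The bounds and the constraint handle c are hypothetical; c returns values that must be nonpositive at feasible points:

```matlab
lb = [0 -1];  ub = [1 1];            % hypothetical bounds
c  = @(p) p(1) + p(2) - 1.5;         % hypothetical inequality, c(p) <= 0
% 'bounds' checks only lb/ub; 'bounds-ineqs' also checks c
feasible = @(p) all(p >= lb) && all(p <= ub) && all(c(p) <= 0);
```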
Run Local Solver
MultiStart runs the local solver specified in problem.solver, starting at the
points that pass the StartPointsToRun filter. If MultiStart is running in
parallel, it sends start points to worker processors one at a time, and the
worker processors run the local solver.

The local solver checks whether MaxTime seconds have elapsed at each of its
iterations. If so, it exits that iteration without reporting a solution.

When the local solver stops, MultiStart stores the results and continues
to the next step.
Report to Iterative Display. When the MultiStart Display property is
'iter', every point that the local solver runs creates one line in the
MultiStart iterative display.
Check Stopping Conditions
MultiStart stops when it runs out of start points. It also stops when it
exceeds a total run time of MaxTime seconds.
Create GlobalOptimSolution Object
After MultiStart reaches a stopping condition, the algorithm creates a vector
of GlobalOptimSolution objects as follows:

1 Sort the local solutions by objective function value (Fval) from lowest to
highest. For the lsqnonlin and lsqcurvefit local solvers, the objective
function is the norm of the residual.

2 Loop over the local solutions j beginning with the lowest (best) Fval.

3 Find all the solutions k satisfying both:

|Fval(k) - Fval(j)| <= TolFun*max(1,|Fval(j)|)
|x(k) - x(j)| <= TolX*max(1,|x(j)|)

4 Record j, Fval(j), the local solver output structure for j, and a cell
array of the start points for j and all the k. Remove those points k
from the list of local solutions. This point is one entry in the vector of
GlobalOptimSolution objects.

The resulting vector of GlobalOptimSolution objects is in order by Fval,
from lowest (best) to highest (worst).
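Steps 1–4 can be sketched as the following clustering loop. This is an illustration with made-up data, not the toolbox code; X holds one local solution per row and Fval the corresponding objective values:

```matlab
X    = [5 5; 0 0; 1e-8 0];     % made-up local solution points (rows)
Fval = [7; 1; 1 + 1e-9];       % corresponding objective values
TolX = 1e-6;  TolFun = 1e-6;

[Fs, order] = sort(Fval);      % step 1: sort by Fval, best first
Xs = X(order,:);
remaining = true(size(Fs));
reps = [];                     % one representative per GlobalOptimSolution
for j = 1:numel(Fs)
    if ~remaining(j), continue, end
    for k = j+1:numel(Fs)      % step 3: solutions close to j in x and Fval
        if remaining(k) && ...
           abs(Fs(k) - Fs(j)) <= TolFun*max(1,abs(Fs(j))) && ...
           norm(Xs(k,:) - Xs(j,:)) <= TolX*max(1,norm(Xs(j,:)))
            remaining(k) = false;   % step 4: absorb duplicate k into j
        end
    end
    reps(end+1) = j;           % j becomes one entry in the solution vector
end
```

With this data the second and third solutions merge (they agree to tolerance), leaving two entries, already ordered by Fval.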
Report to Iterative Display. After examining all the local solutions,
MultiStart gives a summary to the iterative display. This summary includes
the number of local solver runs that converged, the number that failed to
converge, and the number that had errors.
Bibliography
[1] Ugray, Zsolt, Leon Lasdon, John C. Plummer, Fred Glover, James Kelly,
and Rafael Martí. “Scatter Search and Local NLP Solvers: A Multistart
Framework for Global Optimization.” INFORMS Journal on Computing, Vol.
19, No. 3, 2007, pp. 328–340.

[2] Glover, F. “A template for scatter search and path relinking.” Artificial
Evolution (J.-K. Hao, E. Lutton, E. Ronald, M. Schoenauer, D. Snyers, eds.).
Lecture Notes in Computer Science, 1363, Springer, Berlin/Heidelberg, 1998,
pp. 13–54.

[3] Dixon, L. C. W. and G. P. Szegö. “The Global Optimization Problem: an
Introduction.” Towards Global Optimisation 2 (Dixon, L. C. W. and G. P.
Szegö, eds.). Amsterdam, The Netherlands: North Holland, 1978.
Improving Results
In this section...
“Can You Certify a Solution Is Global?” on page 3-45
“Refining the Start Points” on page 3-48
“Changing Options” on page 3-55
“Reproducing Results” on page 3-59
Can You Certify a Solution Is Global?
How can you tell if you have located the global minimum of your objective
function? The short answer is that you cannot; you have no guarantee that
the result of a Global Optimization Toolbox solver is a global optimum.
However, you can use the following strategies for investigating solutions:
• “Check if a Solution Is a Local Solution with patternsearch” on page 3-45
• “Identify a Bounded Region That Contains a Global Solution” on page 3-46
• “Use MultiStart with More Start Points” on page 3-47
Check if a Solution Is a Local Solution with patternsearch
Before you can determine if a purported solution is a global minimum, first
check that it is a local minimum. To do so, run patternsearch on the problem.

To convert the problem to use patternsearch instead of fmincon or fminunc,
enter

problem.solver = 'patternsearch';

Also, change the start point to the solution you just found:

problem.x0 = x;

For example, Check Nearby Points (in the Optimization Toolbox
documentation) shows the following:

options = optimset('Algorithm','active-set');
ffun = @(x)(x(1)-(x(1)-x(2))^2);
problem = createOptimProblem('fmincon', ...
[x fval exitflag] = fmincon(problem)
x=
1.0e-007 *
fval =
-2.6059e-016
exitflag =
However, checking this purported solution with patternsearch shows that
there is a better solution. Start patternsearch from the candidate solution:

% set the candidate solution x as the start point
problem.x0 = x;
problem.solver = 'patternsearch';
[xp fvalp exitflagp] = patternsearch(problem)
Identify a Bounded Region That Contains a Global Solution
Suppose you have a smooth objective function in a bounded region. Given
enough time and start points, MultiStart eventually locates a global solution.
Therefore, if you can bound the region where a global solution can exist, you
can obtain some degree of assurance that MultiStart locates the global
solution.
For example, consider the function

f(x,y) = x^6 + y^6 + sin(x + y)*(x^2 + y^2)
         - cos(x^2/(1 + y^2))*(2 + x^4 + x^2*y^2 + y^4).

The initial summands x^6 + y^6 force the function to become large and positive
for large values of |x| or |y|. The components of the global minimum of the
function must be within the bounds

-10 <= x,y <= 10,

since 10^6 is much larger than all the multiples of 10^4 that occur in the other
summands of the function.
You can identify smaller bounds for this problem; for example, the global
minimum is between –2 and 2. It is more important to identify reasonable
bounds than it is to identify the best bounds.
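The bounded search can be set up as follows. The objective matches the formula above; the solver lines are left as comments because they require Global Optimization Toolbox:

```matlab
f = @(x) x(1)^6 + x(2)^6 + sin(x(1)+x(2))*(x(1)^2 + x(2)^2) ...
    - cos(x(1)^2/(1 + x(2)^2))*(2 + x(1)^4 + x(1)^2*x(2)^2 + x(2)^4);

% problem = createOptimProblem('fmincon','objective',f,'x0',[0 0], ...
%     'lb',[-10 -10],'ub',[10 10]);
% [x,fval] = run(MultiStart,problem,50);
```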
Use MultiStart with More Start Points
To check whether there is a better solution to your problem, run MultiStart
with additional start points. Use MultiStart instead of GlobalSearch for this
task because GlobalSearch does not run the local solver from all start points.

For example, see “Example: Searching for a Better Solution” on page 3-52.

Updating Unconstrained Problem from GlobalSearch. If you use
GlobalSearch on an unconstrained problem, change your problem structure
before using MultiStart. You have two choices in updating a problem
structure for an unconstrained problem using MultiStart:

• Change the solver field to 'fminunc':

problem.solver = 'fminunc';

To avoid a warning if your objective function does not compute a gradient,
change the local options structure to have LargeScale set to 'off':

problem.options.LargeScale = 'off';

• Add an artificial constraint, retaining fmincon as the local solver:

problem.lb = -Inf;
To search a larger region than the default, see “Refining the Start Points”
on page 3-48.
Refining the Start Points
If some components of your problem are unconstrained, GlobalSearch and
MultiStart use artificial bounds to generate random start points uniformly
in each component. However, if your problem has far-flung minima, you need
widely dispersed start points to find these minima.

Use these methods to obtain widely dispersed start points:
• Give widely separated bounds in your problem structure.
• Use a RandomStartPointSet object with the MultiStart algorithm. Set a
large value of the ArtificialBound property in the RandomStartPointSet
object.
• Use a CustomStartPointSet object with the MultiStart algorithm. Use
widely dispersed start points.
There are advantages and disadvantages of each method.
Method: Give bounds in problem
Advantages: Automatic point generation. Can use with GlobalSearch. Easy
to do. Bounds can be asymmetric.
Disadvantages: Changes problem. Makes a more complex Hessian. Only
uniform points. Unclear how large to set the bounds.

Method: Large ArtificialBound in RandomStartPointSet
Advantages: Automatic point generation. Does not change problem. Easy
to do.
Disadvantages: MultiStart only. Only symmetric, uniform points. Unclear
how large to set ArtificialBound.

Method: CustomStartPointSet
Advantages: Customizable. Does not change problem.
Disadvantages: MultiStart only. Requires programming for generating
points.
Methods of Generating Start Points
• “Uniform Grid” on page 3-49
• “Perturbed Grid” on page 3-50
• “Widely Dispersed Points for Unconstrained Components” on page 3-50
Uniform Grid. To generate a uniform grid of start points:
1 Generate multidimensional arrays with ndgrid. Give the lower bound,
spacing, and upper bound for each component.
For example, to generate a set of three-dimensional arrays with
• First component from –2 through 0, spacing 0.5
• Second component from 0 through 2, spacing 0.25
• Third component from –10 through 5, spacing 1
[X,Y,Z] = ndgrid(-2:.5:0,0:.25:2,-10:5);
2 Place the arrays into a single matrix, with each row representing one start
point. For example:
W = [X(:),Y(:),Z(:)];
In this example, W is a 720-by-3 matrix.
3 Put the matrix into a CustomStartPointSet object. For example:
custpts = CustomStartPointSet(W);
4 Call run with the CustomStartPointSet object as the third input. For
example,
% Assume problem structure and ms MultiStart object exist
[x fval flag outpt manymins] = run(ms,problem,custpts);
Perturbed Grid. Integer start points can yield less robust solutions than
slightly perturbed start points.
To obtain a perturbed set of start points:
1 Generate a matrix of start points as in steps 1–2 of “Uniform Grid” on
page 3-49.
2 Perturb the start points by adding a random normal matrix with 0 mean
and relatively small variance.
For the example in “Uniform Grid” on page 3-49, after making the
W matrix,
add a perturbation:
[X,Y,Z] = ndgrid(-2:.5:0,0:.25:2,-10:5);
W = [X(:),Y(:),Z(:)];
W = W + 0.01*randn(size(W));
3 Put the matrix into a CustomStartPointSet object. For example:
custpts = CustomStartPointSet(W);
4 Call run with the CustomStartPointSet object as the third input. For
example,
% Assume problem structure and ms MultiStart object exist
[x fval flag outpt manymins] = run(ms,problem,custpts);
Widely Dispersed Points for Unconstrained Components. Some
components of your problem can lack upper or lower bounds. For example:
• Although no explicit bounds exist, there are levels that the components
cannot attain. For example, if one component represents the weight of a
single diamond, there is an implicit upper bound of 1 kg (the Hope Diamond
is under 10 g). In such a case, give the implicit bound as an upper bound.
• There truly is no upper bound. For example, the size of a computer file in
bytes has no effective upper bound. The largest size can be in gigabytes
or terabytes today, but in 10 years, who knows?
For truly unbounded components, you can use the following methods
of sampling. To generate approximately 1/n points in each region
(exp(n),exp(n+1)), use the following formula. If u is random and uniformly
distributed from 0 through 1, then r = 2u – 1 is uniformly distributed between
–1 and 1. Take

y = sgn(r)*(exp(1/|r|) - e).

y is symmetric and random. For a variable bounded below by lb, take

y = lb + (exp(1/u) - e).

Similarly, for a variable bounded above by ub, take

y = ub - (exp(1/u) - e).

For example, suppose you have a three-dimensional problem with:

• x(1) > 0
• x(2) < 100
• x(3) unconstrained

To make 150 start points satisfying these constraints:
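One way to generate such points is the following sketch, which vectorizes the formulas above. The final two lines are commented out because they require Global Optimization Toolbox, and the names ms and problem are assumptions:

```matlab
u = rand(150,3);                           % uniform samples in (0,1)
pts1 = 0 + (exp(1./u(:,1)) - exp(1));      % x(1) > 0: bounded below by 0
pts2 = 100 - (exp(1./u(:,2)) - exp(1));    % x(2) < 100: bounded above by 100
r = 2*u(:,3) - 1;                          % uniform on (-1,1)
pts3 = sign(r).*(exp(1./abs(r)) - exp(1)); % x(3): unconstrained, symmetric
W = [pts1 pts2 pts3];
% startpts = CustomStartPointSet(W);
% [x,fval] = run(ms,problem,startpts);     % ms, problem assumed to exist
```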
The following is a variant of this algorithm. Generate a number between 0
and infinity by the method for lower bounds. Use this number as the radius
of a point. Generate the other components of the point by taking random
numbers for each component and multiplying by the radius. You can normalize
the random numbers, before multiplying by the radius, so their norm is 1. For
a worked example of this method, see “MultiStart Without Bounds, Widely
Dispersed Start Points” on page 3-81.
Example: Searching for a Better Solution
MultiStart fails to find the global minimum in “Multiple Local Minima Via
MultiStart”. Running MultiStart with more start points can improve the result.
This time MultiStart found the global m inimum, and found 53 total local
minima.
To see the range of local solutions, enter
hist([manymins.Fval]).
Tighter Bound on the Start Points. Suppose you believe that the
interesting local solutions have absolute values of all components less
than 100. The default value of the bound on start points is 1000. To use
a different value of the bound, generate a