
Commit 49b7ea5

Merge pull request #317 from DanielVandH/patch-2
Fix MultistartOptimization documentation
2 parents d07cfe7 + 92116a7 commit 49b7ea5

1 file changed: docs/src/optimization_packages/multistartoptimization.md (+13 additions, −10 deletions)

`MultistartOptimization` requires both a global and a local method to be defined. The global multistart method chooses the set of initial points from which the local method is started.

Currently, only one global method (`TikTak`) is implemented. It is called via `MultistartOptimization.TikTak(n)`, where `n` is the number of initial Sobol points.

## Installation: OptimizationMultistartOptimization.jl

To use this package, install the OptimizationMultistartOptimization package:

```julia
import Pkg; Pkg.add("OptimizationMultistartOptimization")
```

!!! note

    You also need to load the relevant subpackage for the local method of your choice. For example, if you plan to use one of NLopt.jl's optimizers, install and load OptimizationNLopt as described in the [NLopt.jl](@ref) section.
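As a concrete sketch of the setup the note describes (assuming NLopt.jl is the chosen local solver; substitute e.g. OptimizationOptimJL for Optim.jl solvers), the install-and-load steps could look like:

```julia
import Pkg

# Install the multistart wrapper together with the subpackage
# wrapping the local method (OptimizationNLopt here).
Pkg.add(["OptimizationMultistartOptimization", "OptimizationNLopt"])

# Both packages must then be loaded before solving.
using OptimizationMultistartOptimization
using OptimizationNLopt
```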

## Global Optimizer

### Without Constraint Equations
2121

22-
The methods in [`MultistartOptimization`](https://github.com/tpapp/MultistartOptimization.jl) is performing global optimization on problems without
22+
The methods in [`MultistartOptimization`](https://github.com/tpapp/MultistartOptimization.jl) are performing global optimization on problems without
2323
constraint equations. However, lower and upper constraints set by `lb` and `ub` in the `OptimizationProblem` are required.
2424


## Examples

The Rosenbrock function can be optimized using `MultistartOptimization.TikTak()` with 100 initial points and the local method `NLopt.LD_LBFGS()` as follows:

```julia
using OptimizationMultistartOptimization
using OptimizationNLopt
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]
f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, MultistartOptimization.TikTak(100), NLopt.LD_LBFGS())
```
3739

38-
You can use any `Optimization` optimizers you like. The global method of the `MultiStartOptimization` is a positional argument and followed by the local method. This for example means we can perform a multistartoptimization with LBFGS as the optimizer using either the `NLopt.jl` or `Optim.jl` implementation as follows. Moreover, this interface allows you access and adjust all the optimizer settings as you normally would:
40+
You can use any `Optimization` optimizers you like. The global method of the `MultistartOptimization` is a positional argument and followed by the local method. This for example means we can perform a multistartoptimization with LBFGS as the optimizer using either the `NLopt.jl` or `Optim.jl` implementation as follows. Moreover, this interface allows you access and adjust all the optimizer settings as you normally would:

```julia
using OptimizationOptimJL
using ForwardDiff
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]
f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, MultistartOptimization.TikTak(100), LBFGS(), maxiters = 5)
```
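Once a solve call like the ones above returns, the result can be inspected through the standard SciML solution accessors (a sketch; it assumes `prob` and `sol` from the example above, and uses the `u`, `objective`, and `retcode` fields of the Optimization.jl solution type):

```julia
# Minimizer returned by the best local run; for Rosenbrock with
# p = [1.0, 100.0] this should be close to [1.0, 1.0].
sol.u

# Objective value at the minimizer (near zero here).
sol.objective

# Return code indicating whether the solve terminated successfully.
sol.retcode
```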
