# Updates
One of the most central functions in `SimpleSolvers` is `update!`. It has methods for many different types and structs:
- `SimpleSolvers.update!(::Hessian, ::AbstractVector)`: this routine exists for most `Hessian`s, i.e. for `HessianFunction`, `HessianAutodiff`, `HessianBFGS` and `HessianDFP`,
- `SimpleSolvers.update!(::SimpleSolvers.NewtonSolverCache, ::AbstractVector)`,
- `SimpleSolvers.update!(::SimpleSolvers.NonlinearSolverStatus, ::AbstractVector, ::Base.Callable)`,
- `SimpleSolvers.update!(::SimpleSolvers.NewtonOptimizerCache, ::AbstractVector, ::AbstractVector, ::Hessian)`,
- `SimpleSolvers.update!(::SimpleSolvers.NewtonOptimizerState, ::AbstractVector)`,
- `SimpleSolvers.update!(::SimpleSolvers.OptimizerResult, ::AbstractVector, ::AbstractVector, ::AbstractVector)`.
So in the simplest case `update!` takes the object that has to be updated together with a single vector; some methods require additional arguments.
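To illustrate this convention, here is a minimal sketch of how an `update!` method for a user-defined type could look; `MyCache` is made up for this example and is not part of `SimpleSolvers`:

```julia
import SimpleSolvers: update!

# Hypothetical container; not part of SimpleSolvers.
struct MyCache{T}
    x::Vector{T}
end

# The convention: take the object to be updated and a vector,
# overwrite the stored data in place and return the object.
function update!(cache::MyCache, x::AbstractVector)
    cache.x .= x
    cache
end
```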
## Examples
### Hessian
If we look at the case of the `Hessian`, it stores a matrix $H$ that has to be updated in every iteration. We first initialize the matrix[^1]:
```julia
f = x -> sum(x .^ 3 / 6 + x .^ 2 / 2)
x = [1., 0., 0.]
hes = Hessian(f, x; mode = :autodiff)
hes.H
```

```
3×3 Matrix{Float64}:
 NaN  NaN  NaN
 NaN  NaN  NaN
 NaN  NaN  NaN
```

And then update:
```julia
update!(hes, x)
hes.H
```

```
3×3 Matrix{Float64}:
 2.0  0.0  0.0
 0.0  1.0  0.0
 0.0  0.0  1.0
```
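For this particular `f` the Hessian can also be computed by hand, which gives a quick sanity check of the updated matrix. Since $f(x) = \sum_i (x_i^3/6 + x_i^2/2)$, the Hessian is diagonal with entries $x_i + 1$:

```julia
using LinearAlgebra

# Analytic Hessian of f: ∂²f/∂xᵢ² = xᵢ + 1; all off-diagonal entries vanish.
H_analytic = Diagonal(x .+ 1)

# Agrees with the matrix computed by update!(hes, x).
hes.H ≈ H_analytic   # true
```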
### NewtonOptimizerCache
In order to update an instance of `SimpleSolvers.NewtonOptimizerCache` we have to supply a `Gradient` and a `Hessian` in addition to `x`:
```julia
grad = Gradient(f, x; mode = :autodiff)
cache = NewtonOptimizerCache(x)
update!(cache, x, grad, hes)
```

```
SimpleSolvers.NewtonOptimizerCache{Float64, Vector{Float64}}([1.0, 0.0, 0.0], [1.0, 0.0, 0.0], [-0.75, -0.0, -0.0], [1.5, 0.0, 0.0], [-1.5, -0.0, -0.0])
```

We note that when calling `update!` on the `NewtonOptimizerCache`, the Hessian `hes` is not automatically updated! This has to be done manually.
Calling `update!` on the `NewtonOptimizerCache` updates everything except `x`, as updating `x` in general requires another line search!
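A full refresh of the cache therefore consists of two calls, first for the Hessian and then for the cache, combining the steps shown above:

```julia
# The Hessian is not touched by update!(cache, ...),
# so it has to be brought up to date first.
update!(hes, x)
update!(cache, x, grad, hes)
```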
So that we do not have to update the Hessian and the `SimpleSolvers.NewtonOptimizerCache` separately, we can use `SimpleSolvers.NewtonOptimizerState`:
```julia
obj = MultivariateObjective(f, x)
state = NewtonOptimizerState(x)
update!(state, x, Gradient(obj), hes)
```

```
SimpleSolvers.NewtonOptimizerState{Float64, SimpleSolvers.BacktrackingState{Float64}, SimpleSolvers.NewtonOptimizerCache{Float64, Vector{Float64}}}(Backtracking with α₀ = 1.0, ϵ = 0.0001 and p = 0.5., SimpleSolvers.NewtonOptimizerCache{Float64, Vector{Float64}}([1.0, 0.0, 0.0], [1.0, 0.0, 0.0], [-0.75, -0.0, -0.0], [1.5, 0.0, 0.0], [-1.5, -0.0, -0.0]))
```

### OptimizerResult
We also show how to update an instance of `SimpleSolvers.OptimizerResult`:
```julia
result = OptimizerResult(x, obj)
update!(result, x, obj, grad)
```

```
SimpleSolvers.OptimizerResult{Float64, Float64, Vector{Float64}, SimpleSolvers.OptimizerStatus{Float64, Float64}}(
 * Iterations

    n = 1

 * Convergence measures

    |x - x'|               = NaN
    |x - x'|/|x'|          = NaN
    |f(x) - f(x')|         = NaN
    |f(x) - f(x')|/|f(x')| = NaN
    |g(x) - g(x')|         = NaN
    |g(x)|                 = 1.50e+00
, [1.0, 0.0, 0.0], 0.6666666666666666, [1.5, 0.0, 0.0])
```

Note that the residuals are still `NaN`s here. In order to get proper values for these we have to perform a second update:
```julia
x₂ = [.9, 0., 0.]
update!(result, x₂, obj, grad)
```

```
SimpleSolvers.OptimizerResult{Float64, Float64, Vector{Float64}, SimpleSolvers.OptimizerStatus{Float64, Float64}}(
 * Iterations

    n = 2

 * Convergence measures

    |x - x'|               = 1.00e-01
    |x - x'|/|x'|          = 1.11e-01
    |f(x) - f(x')|         = 1.40e-01
    |f(x) - f(x')|/|f(x')| = 2.66e-01
    |g(x) - g(x')|         = 1.95e-01
    |g(x)|                 = 1.31e+00
, [0.9, 0.0, 0.0], 0.5265000000000001, [1.3050000000000002, 0.0, 0.0])
```
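The objective and gradient values stored in the result can be checked by hand; for $x_2 = (0.9, 0, 0)$ only the first component contributes:

```julia
# Objective: f(x₂) = 0.9³/6 + 0.9²/2
0.9^3 / 6 + 0.9^2 / 2   # 0.5265, the stored objective value

# First gradient component: ∂f/∂x₁ = x₁²/2 + x₁
0.9^2 / 2 + 0.9         # 1.305, the first entry of the stored gradient
```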
`NewtonOptimizerCache`, `OptimizerResult` and `NewtonOptimizerState` (through `MultivariateObjective`) all store similar things, for example `x`. This may make it somewhat difficult to keep track of everything that happens during optimization.
An `Optimizer` stores a `MultivariateObjective`, a `SimpleSolvers.OptimizerResult` and an `OptimizationAlgorithm` (and therefore the `MultivariateObjective` again). We also give an example:
```julia
opt = Optimizer(x, obj)
update!(opt, x)
```

```
* Algorithm: BFGS

* Linesearch: Backtracking with α₀ = 1.0, ϵ = 0.0001 and p = 0.5.

* Iterations

    n = 1

* Convergence measures

    |x - x'|               = NaN ≰ 4.4e-16
    |x - x'|/|x'|          = NaN ≰ 4.4e-16
    |f(x) - f(x')|         = NaN ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = NaN ≰ 4.4e-16
    |g(x)|                 = 1.50e+00 ≰ 1.5e-08

* Candidate solution

    Final solution value:     [1.000000e+00, 0.000000e+00, 0.000000e+00]
    Final objective value:    6.666667e-01
```
As when calling `update!` on `SimpleSolvers.OptimizerResult`, the diagnostics cannot be computed with only one iteration; we have to compute a second one:
```julia
x₂ = [.9, 0., 0.]
update!(opt, x₂)
```

```
* Algorithm: BFGS

* Linesearch: Backtracking with α₀ = 1.0, ϵ = 0.0001 and p = 0.5.

* Iterations

    n = 2

* Convergence measures

    |x - x'|               = 1.00e-01 ≰ 4.4e-16
    |x - x'|/|x'|          = 1.11e-01 ≰ 4.4e-16
    |f(x) - f(x')|         = 1.40e-01 ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = 2.66e-01 ≰ 4.4e-16
    |g(x)|                 = 1.31e+00 ≰ 1.5e-08

* Candidate solution

    Final solution value:     [9.000000e-01, 0.000000e+00, 0.000000e+00]
    Final objective value:    5.265000e-01
```
We note that simply calling `update!` on an instance of `SimpleSolvers.Optimizer` is not enough to perform a complete iteration since the computation of a new $x$ in general requires a line search procedure.
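To make this concrete, the following sketch shows how a complete iteration could be structured; `determine_step` is a hypothetical placeholder for the line search and is not a `SimpleSolvers` function:

```julia
# Hypothetical outline of a complete iteration; determine_step is a
# made-up placeholder for the line search, not part of SimpleSolvers.
function complete_iteration!(opt, x)
    update!(opt, x)               # refresh objective, gradient and diagnostics
    δx = determine_step(opt, x)   # line search computes the actual step
    x .+= δx                      # only this advances x itself
    x
end
```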
We also note that `update!` always returns the first input argument.
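This means the return value can be used directly in further expressions, for example:

```julia
# update! returns the object that was passed in,
# so the result is the very same hes instance.
update!(hes, x) === hes   # true
```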
[^1]: The constructor uses the function `SimpleSolvers.initialize!`.