Quadratic Line Search
Quadratic line search is based on making a quadratic approximation to the line search objective and then picking the minimum of this quadratic approximation as the next iterate of $\alpha$.
The quadratic polynomial is built the following way[1]:
\[p(\alpha) = f^\mathrm{ls}(0) + (f^\mathrm{ls})'(0)\alpha + p_2\alpha^2,\]
and we write $p_0 := f^\mathrm{ls}(0)$ and $p_1 := (f^\mathrm{ls})'(0)$. The coefficient $p_2$ is then determined as follows:
- take a value $\alpha$ (typically found with bracket_minimum_with_fixed_point) and compute $y = f^\mathrm{ls}(\alpha)$,
- set $p_2 \gets \frac{y - p_0 - p_1\alpha}{\alpha^2}.$
After the polynomial is found, we take its minimum (analogously to the Bierlaire quadratic line search) and check whether it satisfies the sufficient decrease condition. If it does not, we repeat the process with the new trial step.
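The loop described above can be sketched in a few lines. The following is a minimal illustration in Python; the function `quadratic_line_search` and its arguments are hypothetical and not part of the SimpleSolvers API:

```python
def quadratic_line_search(f, df0, f0, alpha, c1=1e-4, max_iter=20):
    """One-dimensional quadratic line search (sketch).

    f     : line search objective f_ls(alpha)
    f0    : f_ls(0)   (coefficient p0)
    df0   : f_ls'(0)  (coefficient p1), assumed negative (descent direction)
    alpha : initial trial step, e.g. from a bracketing routine
    """
    p0, p1 = f0, df0
    for _ in range(max_iter):
        y = f(alpha)
        # fit p(a) = p0 + p1*a + p2*a^2 through f(0), f'(0) and f(alpha)
        p2 = (y - p0 - p1 * alpha) / alpha**2
        if p2 <= 0:  # parabola has no interior minimum; keep the trial step
            break
        alpha = -p1 / (2 * p2)  # vertex of the parabola
        if f(alpha) <= f0 + c1 * alpha * df0:  # sufficient decrease condition
            break
    return alpha

# for a quadratic objective the fit is exact:
# f_ls(a) = (a - 1)^2 has its minimum at a = 1
step = quadratic_line_search(lambda a: (a - 1)**2, df0=-2.0, f0=1.0, alpha=3.0)
```

Since the fitted parabola agrees with a quadratic objective exactly, a single iteration recovers the true minimizer here.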
Example
Here we treat the following problem:
f(x::Union{T, Vector{T}}) where {T<:Number} = exp.(x) .* (x .^ 3 .- 5x .+ 2x) .+ 2one(T)
f!(y::AbstractVector{T}, x::AbstractVector{T}) where {T} = y .= f.(x)
F!(y::AbstractVector{T}, x::AbstractVector{T}, params) where {T} = f!(y, x)

We now want to use quadratic line search to find the root of this function starting at $x = 0$. We initialize a line search problem:
using SimpleSolvers
# starting point (a one-element vector)
x = [0.]
# allocate solver
solver = NewtonSolver(x, f(x); F = F!)
# allocate and update state
state = NonlinearSolverState(x)
update!(state, x, f(x))
params = (x = state.x, parameters = NullParameters())
direction!(solver, x, params, iteration_number(state))
# allocate linesearch problem
ls_obj = linesearch_problem(nonlinearproblem(solver), jacobian(solver), cache(solver))
# get fˡˢ & ∂fˡˢ∂α (stored in the linesearch problem)
fˡˢ(alpha) = ls_obj.F(alpha, params)
∂fˡˢ∂α(alpha) = ls_obj.D(alpha, params)

The second plot shows the optimization problem for the ideal step length, where we start from $x_0$ and proceed in the Newton direction. In the following we determine its minimum by fitting the quadratic polynomial $p$.
The first two coefficients of the polynomial $p$ (i.e. $p_0$ and $p_1$) are easy to compute:

p₀ = fˡˢ(0.)
# 4.0

p₁ = ∂fˡˢ∂α(0.)
# -8.0

Finding $p_2$
We call bracket_minimum_with_fixed_point to find $\alpha_0.$ For this point we should have $f^\mathrm{ls}(\alpha_0) > f^\mathrm{ls}(0).$
using SimpleSolvers: bracket_minimum_with_fixed_point
update!(state, x, f(x))
params = (x = state.x, parameters = NullParameters())
α₀ = bracket_minimum_with_fixed_point(ls_obj, params, 0.)[2]
# 1.28
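The exact algorithm behind bracket_minimum_with_fixed_point is not reproduced here, but a common bracketing strategy (growing the trial step geometrically until the objective rises above its starting value) can be sketched as follows. The helper `bracket_from_zero` and its parameters are hypothetical, chosen only for illustration:

```python
def bracket_from_zero(f, step=1e-2, factor=2.0, max_iter=60):
    """Grow the trial step geometrically until f(alpha) exceeds f(0),
    so that the interval [0, alpha] brackets a minimum of f.
    Sketch only; not the SimpleSolvers implementation."""
    f0 = f(0.0)
    alpha = step
    for _ in range(max_iter):
        if f(alpha) > f0:
            return (0.0, alpha)
        alpha *= factor
    raise RuntimeError("no bracket found")

# f(t) = (t - 1)^2 has its minimum at t = 1, which the bracket must contain
a, b = bracket_from_zero(lambda t: (t - 1)**2)
```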

We can now finally compute $p_2$ and determine the minimum of the polynomial:
y = fˡˢ(α₀)
p₂ = (y - p₀ - p₁*α₀) / α₀^2
p(α) = p₀ + p₁ * α + p₂ * α^2
α₁ = -p₁ / (2p₂)
# 0.5141386947276962
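The formula for $\alpha_1$ follows from setting the derivative of $p$ to zero:
\[p'(\alpha) = p_1 + 2p_2\alpha = 0 \quad\Longrightarrow\quad \alpha_1 = -\frac{p_1}{2p_2},\]
which is indeed a minimum as long as $p_2 > 0$.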

We now check whether $\alpha_1$ satisfies the sufficient decrease condition:
sdc = SufficientDecreaseCondition(DEFAULT_WOLFE_c₁, fˡˢ(0.), ∂fˡˢ∂α(0.), fˡˢ)
sdc(α₁)
# true

We now move the original $x$ in the Newton direction with step length $\alpha_1$ by using compute_new_iterate!:

compute_new_iterate!(x, α₁, direction(cache(solver)))
# 1-element Vector{Float64}:
#  0.3427591298184641
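The sufficient decrease (Armijo) condition checked above can be written out explicitly. The following is a minimal sketch in Python, with `c1` playing the role of DEFAULT_WOLFE_c₁ (the value 1e-4 is a common choice, assumed here for illustration):

```python
def sufficient_decrease(f, f0, df0, alpha, c1=1e-4):
    """Armijo condition: f(alpha) <= f(0) + c1 * alpha * f'(0)."""
    return f(alpha) <= f0 + c1 * alpha * df0

# f_ls(a) = (a - 2)^2 with f0 = 4 and df0 = -4:
# a step of 1 decreases the objective to 1, so the condition holds
ok = sufficient_decrease(lambda a: (a - 2)**2, f0=4.0, df0=-4.0, alpha=1.0)
```

A step that overshoots far past the minimum (e.g. `alpha=5.0` in this example) fails the same check, which is what triggers another iteration of the polynomial fit.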

And we see that we are already very close to the root.
Example II
Here we consider the same example as when discussing the Bierlaire quadratic line search.
state = NonlinearSolverState(x)
update!(state, x, f(x))
params = (x = state.x, parameters = NullParameters())
ls_obj = linesearch_problem(nlp, JacobianFunction{Float64}(F!, J!), cache(solver))
fˡˢ(alpha) = ls_obj.F(alpha, params)
∂fˡˢ∂α(alpha) = ls_obj.D(alpha, params)

We now try to find a minimum of $f^\mathrm{ls}$ with quadratic line search. For this we first need to find a bracket; we again do this with SimpleSolvers.bracket_minimum_with_fixed_point:
(a, b) = SimpleSolvers.bracket_minimum_with_fixed_point(fˡˢ, 0.)
# (0.0, 5.12)

We plot the bracket:

We now build the polynomial:
p₀ = fˡˢ(a)
p₁ = ∂fˡˢ∂α(a)
y = fˡˢ(b)
p₂ = (y - p₀ - p₁*b) / b^2
p(α) = p₀ + p₁ * α + p₂ * α^2

and compute its minimum:

αₜ = -p₁ / (2p₂)
# 2.446514989327422

We now set $a \gets \alpha_t$ and perform another iteration:
(a, b) = SimpleSolvers.bracket_minimum_with_fixed_point(fˡˢ, αₜ)
# (-0.1034850106725782, 2.4565149893274216)

We again build the polynomial:
p₀ = fˡˢ(a)
p₁ = ∂fˡˢ∂α(a)
y = fˡˢ(b)
p₂ = (y - p₀ - p₁*(b-a)) / (b-a)^2
p(α) = p₀ + p₁ * (α-a) + p₂ * (α-a)^2

and compute its minimum:

αₜ = -p₁ / (2p₂) + a
# 1.285313587994457
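In this second iteration the parabola is anchored at $a$ instead of $0$, i.e. $p(\alpha) = p_0 + p_1(\alpha - a) + p_2(\alpha - a)^2$ with $p_0$ and $p_1$ evaluated at $a$, and the vertex shifts accordingly. A small self-contained check of this shifted fit (Python for illustration; `shifted_quadratic_min` is a hypothetical helper):

```python
def shifted_quadratic_min(f, df, a, b):
    """Fit p(alpha) = p0 + p1*(alpha - a) + p2*(alpha - a)^2 using
    f(a), f'(a) and f(b), and return the vertex of the parabola."""
    p0 = f(a)
    p1 = df(a)
    p2 = (f(b) - p0 - p1 * (b - a)) / (b - a)**2
    return a - p1 / (2 * p2)  # alpha_t = -p1 / (2 p2) + a

# for a quadratic objective the shifted fit is exact:
# the minimum of (t - 3)^2 is recovered from data at t = 1 and t = 5
t_min = shifted_quadratic_min(lambda t: (t - 3)**2, lambda t: 2*(t - 3), 1.0, 5.0)
```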

[1] This is different from the Bierlaire quadratic polynomial described in [3].