
How can I evaluate the Hessian of a function in Julia using automatic differentiation (preferably via ReverseDiffSparse)? In the following example, I can compute and evaluate the gradient at a point, values, through JuMP:

m = Model()
@variable(m, x)
@variable(m, y)

@NLobjective(m, Min, sin(x) + sin(y))
values = zeros(2)
values[linearindex(x)] = 2.0
values[linearindex(y)] = 3.0

d = JuMP.NLPEvaluator(m)
MathProgBase.initialize(d, [:Grad])
objval = MathProgBase.eval_f(d, values) # == sin(2.0) + sin(3.0)

∇f = zeros(2)
MathProgBase.eval_grad_f(d, ∇f, values)
# ∇f[linearindex(x)] == cos(2.0)
# ∇f[linearindex(y)] == cos(3.0)

Now I want the Hessian at values. I've tried the following:

MathProgBase.initialize(d, [:Grad,:Hess])
H = zeros(2);  # based on MathProgBase.hesslag_structure(d) = ([1,2],[1,2])
MathProgBase.eval_hesslag(d, H, values, 0, [0])

but am getting an error.
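My guess (not yet verified) is that the last two arguments are the problem: as I understand the MathProgBase interface, the signature is eval_hesslag(d, H, x, σ, μ), where σ scales the objective Hessian and μ holds one Lagrange multiplier per constraint, so for this unconstrained model μ should presumably be an empty Float64 vector rather than [0]. A sketch of what I think the call should look like, continuing from the snippet above and assuming the lower-triangular sparsity pattern returned by hesslag_structure:

MathProgBase.initialize(d, [:Grad, :Hess])

# sparsity pattern of the (lower-triangular) Hessian of the Lagrangian
hess_I, hess_J = MathProgBase.hesslag_structure(d)  # ([1, 2], [1, 2]) here

H = zeros(length(hess_I))   # one slot per structural nonzero
σ = 1.0                     # weight on the objective Hessian
μ = Float64[]               # one multiplier per constraint; none here
MathProgBase.eval_hesslag(d, H, values, σ, μ)
# expected: H[1] == -sin(2.0) (∂²f/∂x²), H[2] == -sin(3.0) (∂²f/∂y²)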

For reference, a cross-post is on the helpful Julia Discourse forum here.

jjjjjj
    This is answered [here](https://discourse.julialang.org/t/jump-interface-access-to-automatic-differentiation-reversediffsparse/8354/4?u=miles.lubin). Please link when you cross-post. – mlubin Jan 15 '18 at 04:49

0 Answers