Is there a way in JuMP to define a single function that returns, for example, a tuple containing the objective value and the gradient, which you can then pass to register?
In the documentation (https://jump.dev/JuMP.jl/stable/manual/nlp/#Register-a-function-and-gradient) I see you can do this:
f(x) = x^2
∇f(x) = 2x
∇²f(x) = 2
model = Model()
register(model, :my_square, 1, f, ∇f, ∇²f)
@variable(model, x >= 0)
@NLobjective(model, Min, my_square(x))
i.e. define the functions separately. In my context I'm solving an optimal control problem where a single function call gives me both the objective and the gradient in one go. Is there a way, when I set autodiff=false, to pass one function that returns the objective and gradient for gradient-based optimization, or all three (objective, gradient, and Hessian) for Hessian-based optimization, depending on my needs?
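To make the question concrete, here is a rough sketch of the workaround I have in mind (the names value_and_gradient, update_cache!, cached_f, cached_∇f are just placeholders I made up, not a JuMP API): one expensive evaluation returns both values, and a small cache splits the result into the separate objective and gradient callbacks that register expects. I'm wondering whether JuMP offers something more direct than this.

using JuMP

# One evaluation that returns the objective value and the gradient together
# (in my real problem this is the expensive optimal-control simulation).
function value_and_gradient(x::Vector{Float64})
    obj = sum(xi^2 for xi in x)   # placeholder objective
    grad = 2 .* x                 # placeholder gradient
    return obj, grad
end

# Simple cache so the two callbacks below share one evaluation per point.
const last_x = Ref{Vector{Float64}}(Float64[])
const last_obj = Ref{Float64}(NaN)
const last_grad = Ref{Vector{Float64}}(Float64[])

function update_cache!(x::Vector{Float64})
    if x != last_x[]
        last_obj[], last_grad[] = value_and_gradient(x)
        last_x[] = copy(x)
    end
    return
end

# Callbacks with the signatures register expects for a multivariate function.
function cached_f(x...)
    update_cache!(collect(x))
    return last_obj[]
end

function cached_∇f(g::AbstractVector{T}, x::T...) where {T}
    update_cache!(collect(x))
    copyto!(g, last_grad[])
    return
end

model = Model()
register(model, :my_obj, 2, cached_f, cached_∇f)
@variable(model, y[1:2] >= 0)
@NLobjective(model, Min, my_obj(y[1], y[2]))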
Also, how do you type symbols like the nabla (∇) and nabla squared (∇²)? Do you need a special editor?