My long subject title pretty much covers it.
I have managed to isolate my much bigger problem in the contrived example below. I cannot figure out exactly where the problem is, though I imagine it has something to do with the type of the preallocated array?
    using ForwardDiff

    function test()
        A = zeros(1_000_000)
        function objective(A, value)
            for i in 1:1_000_000
                A[i] = value[1]
            end
            return sum(A)
        end
        helper_objective = v -> objective(A, v)
        ForwardDiff.gradient(helper_objective, [1.0])
    end
The error reads as follows:
ERROR: MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##69#71")){Array{Float64,1},getfield(Main, Symbol("#objective#70")){Array{Float64,1}}},Float64},Float64,1})
In my own problem (not shown here) I have a function that I need to optimise using Optim and the automatic differentiation it offers, and this function makes use of a big matrix that I would like to preallocate in order to speed up my code. Many thanks.
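For what it's worth, here is a minimal sketch of one workaround I have been experimenting with (this is my own guess, not a confirmed fix): ForwardDiff evaluates the function with ForwardDiff.Dual numbers rather than Float64, so a buffer created with zeros(...) cannot store them. Allocating the buffer with the element type of the input makes it generic over whatever number type ForwardDiff passes in, at the cost of allocating inside the objective:

    using ForwardDiff

    function test2()
        function objective(value)
            # Allocate the buffer with eltype(value) so it can hold
            # ForwardDiff.Dual numbers during differentiation as well as
            # plain Float64 during a normal call.
            A = zeros(eltype(value), 1_000_000)
            for i in 1:1_000_000
                A[i] = value[1]
            end
            return sum(A)
        end
        ForwardDiff.gradient(objective, [1.0])
    end

This gives up the preallocation I was hoping for, which is exactly why I am asking whether there is a way to keep a single preallocated array and still use automatic differentiation.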