
I have the following function call, which executes fine:

```julia
error = prior_error(data, s_vals, ones(20)/20)
```

I now package the arguments up as follows:

```julia
inputs = (data, s_vals, ones(20)/20)
```

And then I try to get the gradient, using the `gradient` function from the ReverseDiff package:

```julia
test = gradient(prior_error, inputs)
```

The type signature of the arguments for the `prior_error` function is as follows:

```julia
prior_error(data::Matrix, sample_vals::Vector, prior::Vector)
```
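Pieced together, a minimal reproduction looks like this (the body of `prior_error` below is a stand-in I've made up, since the real one is long; only the concrete `Matrix`/`Vector` annotations matter for the error):

```julia
using ReverseDiff: gradient

# Stand-in body: any differentiable scalar-valued computation will do.
# The concrete Matrix/Vector annotations are what trigger the MethodError,
# because ReverseDiff calls the function with TrackedArray arguments,
# which are AbstractArrays but not Matrix{Float64}/Vector{Float64}.
prior_error(data::Matrix, sample_vals::Vector, prior::Vector) =
    sum(abs2, data * sample_vals .- prior)

data   = rand(20, 20)
s_vals = rand(20)

error  = prior_error(data, s_vals, ones(20)/20)   # executes fine

inputs = (data, s_vals, ones(20)/20)
err = try
    gradient(prior_error, inputs)
catch e
    e    # MethodError: no method matching prior_error(::TrackedArray, ...)
end
```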

But I get the following error when I try to take that gradient:

```
MethodError: no method matching prior_error(::ReverseDiff.TrackedArray{Float64, Float64, 2, Matrix{Float64}, Matrix{Float64}}, ::ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}, ::ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}})

ReverseDiff.GradientTape(::Function, ::Tuple{Matrix{Float64}, Vector{Float64}, Vector{Float64}}, ::ReverseDiff.GradientConfig{Tuple{ReverseDiff.TrackedArray{Float64, Float64, 2, Matrix{Float64}, Matrix{Float64}}, ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}, ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}}})@tape.jl:207
gradient(::Function, ::Tuple{Matrix{Float64}, Vector{Float64}, Vector{Float64}}, ::ReverseDiff.GradientConfig{Tuple{ReverseDiff.TrackedArray{Float64, Float64, 2, Matrix{Float64}, Matrix{Float64}}, ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}, ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}}})@gradients.jl:22
top-level scope@Local: 5
```

I'm afraid that, being relatively new to both Julia and the ReverseDiff package, I don't even know where to start with this error. Any advice on how to fix it would be greatly appreciated!

UPDATE

I figured out that it's because I'm assigning too strict a type signature. Leaving the function arguments without type annotations allows the autodiff to progress, until I try to do a matrix multiplication and get this error:


```
MethodError: *(::ReverseDiff.TrackedArray{Float64, Float64, 2, Matrix{Float64}, Matrix{Float64}}, ::LinearAlgebra.Diagonal{ReverseDiff.TrackedReal{Float64, Float64, ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}}, ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}}) is ambiguous. Candidates:
  *(A::AbstractMatrix, D::LinearAlgebra.Diagonal) in LinearAlgebra at /home/peter/julia-1.7.3/share/julia/stdlib/v1.7/LinearAlgebra/src/diagonal.jl:222
  *(x::ReverseDiff.TrackedArray{V, D, 2}, y::AbstractMatrix) where {V, D} in ReverseDiff at /home/peter/.julia/packages/ReverseDiff/GtPeW/src/derivatives/linalg/arithmetic.jl:213
  *(x::ReverseDiff.TrackedArray{V, D, 2}, y::AbstractArray) where {V, D} in ReverseDiff at /home/peter/.julia/packages/ReverseDiff/GtPeW/src/derivatives/linalg/arithmetic.jl:213
  *(x::ReverseDiff.TrackedArray{V, D}, y::AbstractMatrix) where {V, D} in ReverseDiff at /home/peter/.julia/packages/ReverseDiff/GtPeW/src/derivatives/linalg/arithmetic.jl:213
  *(A::AbstractMatrix, B::AbstractMatrix) in LinearAlgebra at /home/peter/julia-1.7.3/share/julia/stdlib/v1.7/LinearAlgebra/src/matmul.jl:151
  *(x::ReverseDiff.TrackedArray{V, D}, y::AbstractArray) where {V, D} in ReverseDiff at /home/peter/.julia/packages/ReverseDiff/GtPeW/src/derivatives/linalg/arithmetic.jl:213
Possible fix, define
  *(::ReverseDiff.TrackedArray{V, D, 2}, ::LinearAlgebra.Diagonal) where {V, D}
```

But I don't understand what this means. Am I literally supposed to write out a function with that type signature? Do I need to manually define a method for matrix multiplication?
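For reference, `A * Diagonal(v)` just scales column `j` of `A` by `v[j]`, so one way to sidestep the ambiguity entirely (a sketch, not code from my actual model) would be to express that scaling as a broadcast and never build the `Diagonal` at all:

```julia
using LinearAlgebra

A = rand(3, 3)
v = [1.0, 2.0, 3.0]

# A * Diagonal(v) multiplies column j of A by v[j]; the broadcast below
# computes the same result without constructing a Diagonal, so the
# ambiguous TrackedArray-times-Diagonal method is never called.
B_diag  = A * Diagonal(v)
B_bcast = A .* transpose(v)

B_diag == B_bcast   # true
```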

Pablo
    Unfortunately these ambiguities are hard to avoid, as `*` has so many methods, it's a problem with the TrackedMatrix idea. You can try to define a method [like these](https://github.com/JuliaDiff/ReverseDiff.jl/blob/master/src/derivatives/linalg/arithmetic.jl#L195) with the given signature, and `record_mul` there will handle things further. Opening an issue at ReverseDiff would also be a good idea. – mcabbott Oct 15 '22 at 04:06
