
I'm trying to implement automatic differentiation using the ForwardDiff package.

using LinearAlgebra
import ForwardDiff

function OLS(X,Y,beta)
    f = (Y - X*beta)'*(Y - X*beta)   
end

n = 100
beta = [1.0, 2.2]
X = [ones(n) rand(n)]
Y = X*beta + randn(n) 
beta_hat = inv(X'X)*X'Y;

beta_AD = ForwardDiff.gradient(OLS,beta_hat)

The error message follows. How can I fix this problem? Thanks

ERROR: LoadError: MethodError: no method matching OLS(::Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(OLS), Float64}, Float64, 2}})
Closest candidates are:
  OLS(::Any, ::Any, ::Any) at C:\Users\zhanglu\Documents\Julia\lecture-julia.notebooks\AD_Examples.jl:4
Stacktrace:
 [1] vector_mode_dual_eval!(f::typeof(OLS), cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(OLS), Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(OLS), Float64}, Float64, 2}}}, x::Vector{Float64})

1 Answer


When you call ForwardDiff.gradient on your function, you should define the function as depending only on the vector beta, with X and Y held fixed (for example as keyword arguments). ForwardDiff.gradient expects a function of a single vector argument, which is why your three-argument OLS triggers the MethodError.

import ForwardDiff

# Annotate beta with T <: Real (not Float64): ForwardDiff calls the function
# with a Vector of ForwardDiff.Dual numbers, which are subtypes of Real.
function OLS(beta::Vector{T}; X=zeros(10, 2), Y=zeros(10)) where T <: Real
    (Y - X*beta)' * (Y - X*beta)
end
n = 100
beta = [1.0, 2.2]
X = [ones(n) rand(n)]
Y = X*beta + randn(n) 
beta_hat = inv(X'X)*X'Y;
beta_AD = ForwardDiff.gradient(beta->OLS(beta; X=X, Y=Y), beta_hat)
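As a sanity check (not part of the original answer), you can compare the AD result against the analytic gradient of the sum of squared residuals, which is -2 X'(Y - X*beta). At the OLS estimate the normal equations give X'(Y - X*beta_hat) = 0, so both gradients should be numerically zero:

```julia
using LinearAlgebra
import ForwardDiff

# Same setup as above (rand/randn make the data stochastic)
n = 100
beta = [1.0, 2.2]
X = [ones(n) rand(n)]
Y = X*beta + randn(n)
beta_hat = inv(X'X)*X'Y

OLS(beta; X, Y) = (Y - X*beta)' * (Y - X*beta)

beta_AD = ForwardDiff.gradient(b -> OLS(b; X=X, Y=Y), beta_hat)

# Analytic gradient of the objective: -2 X'(Y - X*beta),
# evaluated at beta_hat it should match beta_AD and be ~0.
grad_analytic = -2 * X' * (Y - X*beta_hat)
maximum(abs.(beta_AD .- grad_analytic))
```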