Questions tagged [automatic-differentiation]

Also known as algorithmic differentiation, or AD for short: techniques that take a procedure evaluating a numerical function and transform it into a procedure that additionally evaluates directional derivatives, gradients, or higher-order derivatives.

Techniques include

  • operator overloading for dual numbers (see the sketch after this list),
  • operator overloading to extract the operation sequence as a tape,
  • code analysis and transformation.
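
As a concrete illustration of the first technique, here is a minimal dual-number sketch in Python; the class and function names are invented for this example, not taken from any library:

    class Dual:
        """Carries a value and its derivative through arithmetic."""
        def __init__(self, val, der=0.0):
            self.val = val
            self.der = der

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.der + other.der)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # product rule: (uv)' = u'v + uv'
            return Dual(self.val * other.val,
                        self.der * other.val + self.val * other.der)

        __rmul__ = __mul__

    def f(x):
        return x * x + 3 * x + 1

    # seed der=1.0 to propagate df/dx; f'(2) = 2*2 + 3 = 7
    print(f(Dual(2.0, 1.0)).der)  # 7.0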

For a function with input of dimension n and output of dimension m, requiring L elementary operations for its evaluation, one directional derivative (forward mode) or one gradient of a scalar output (reverse mode) can be computed with about 3*L operations.
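
The gradient side of that claim is the "cheap gradient" result of reverse mode: record the operations on a tape, then sweep it once backwards. A minimal sketch (invented names, not a library API):

    tape = []  # records (output node, [(input node, local partial), ...])

    class Var:
        def __init__(self, val):
            self.val = val
            self.grad = 0.0

        def __add__(self, other):
            out = Var(self.val + other.val)
            tape.append((out, [(self, 1.0), (other, 1.0)]))
            return out

        def __mul__(self, other):
            out = Var(self.val * other.val)
            tape.append((out, [(self, other.val), (other, self.val)]))
            return out

    def backward(out):
        out.grad = 1.0
        # one pass over the tape in reverse: cost proportional to its length
        for node, parents in reversed(tape):
            for parent, local in parents:
                parent.grad += local * node.grad

    x, y = Var(2.0), Var(3.0)
    z = x * y + x          # z = xy + x
    backward(z)
    print(x.grad, y.grad)  # 4.0 (= y + 1) and 2.0 (= x)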

The accuracy of the computed derivative is automatically nearly as good as that of the function evaluation itself, since AD introduces no truncation error.

Other differentiation methods are

  • symbolic differentiation, where the expanded expression for the derivative is obtained first, which can grow very large depending on the implementation, and
  • numerical differentiation by divided differences, which provides less accuracy at comparable cost, or comparable accuracy at a higher cost (see the comparison below).
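
The step-size dilemma of divided differences is easy to demonstrate; the function below is an arbitrary example:

    import math

    def f(x):
        return math.exp(x) * math.sin(x)

    def fprime(x):  # hand-derived: e^x (sin x + cos x)
        return math.exp(x) * (math.sin(x) + math.cos(x))

    x = 1.0
    for h in (1e-4, 1e-8, 1e-12):
        fd = (f(x + h) - f(x - h)) / (2 * h)  # central difference
        print(h, abs(fd - fprime(x)))
    # the error first shrinks (truncation) and then grows again (rounding);
    # AD has no step size h, so it avoids this trade-off entirely.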

See Wikipedia and autodiff.org.

192 questions
0 votes, 1 answer

Automatic Differentiation: numerical or exact?

I have a question about automatic differentiation, especially in PyTorch, since I am using this library. I have seen, for instance, automatic differentiation give the partial derivatives of an expression with respect to a variable. However, as far as…
sosamm • 39 • 1 • 6
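
For the record, PyTorch's autograd is exact up to floating-point rounding, not a finite-difference scheme; a quick check on an example function of our own choosing:

    import torch

    x = torch.tensor(2.0, requires_grad=True)
    y = x**3 + torch.sin(x)          # example: f(x) = x^3 + sin(x)
    y.backward()                     # reverse-mode AD

    exact = 3 * 2.0**2 + torch.cos(torch.tensor(2.0)).item()
    print(x.grad.item(), exact)      # agree to machine precision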
0 votes, 0 answers

TensorFlow GradientTape crashes with InvalidArgumentError when working on sparse Tensors

I am encountering strange behavior when trying to evaluate the derivatives of a result obtained by sparse tensor operations. If I blow up all sparse inputs to dense before operating on them, the following code works as expected (first part of the…
Franz • 340 • 1 • 10
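
Without the full traceback this is hard to diagnose, but gradient coverage for tf.sparse ops is known to be partial; a commonly suggested workaround is to densify inside the tape, sketched here on invented data:

    import tensorflow as tf

    indices = [[0, 0], [1, 1]]
    sp = tf.sparse.SparseTensor(indices, values=[1.0, 2.0], dense_shape=[2, 2])
    w = tf.Variable(tf.ones([2, 2]))

    with tf.GradientTape() as tape:
        dense = tf.sparse.to_dense(sp)     # densify before operating
        loss = tf.reduce_sum(tf.matmul(dense, w))

    print(tape.gradient(loss, w))          # well-defined dense gradient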
0 votes, 1 answer

How to use tf.gradients within a model and still use a custom training loop?

I would like to make a TensorFlow model where the outputs respect a mathematical condition, namely that output 0 is a scalar function and all subsequent outputs are its partial derivatives w.r.t. the input. This is because my observations are the…
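
In TF2 the usual replacement for tf.gradients is a nested GradientTape: an inner tape for the input derivatives, an outer one for training. A sketch under assumed shapes and an invented toy model:

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="tanh"),
        tf.keras.layers.Dense(1),          # output 0: the scalar function
    ])
    opt = tf.keras.optimizers.Adam()

    x = tf.random.normal([8, 3])           # invented batch
    target = tf.random.normal([8, 3])      # stand-in observed derivatives

    with tf.GradientTape() as train_tape:
        with tf.GradientTape() as input_tape:
            input_tape.watch(x)
            f = model(x)
        dfdx = input_tape.gradient(f, x)   # partials w.r.t. the inputs
        loss = tf.reduce_mean((dfdx - target) ** 2)

    grads = train_tape.gradient(loss, model.trainable_variables)
    opt.apply_gradients(zip(grads, model.trainable_variables))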
0 votes, 0 answers

Is sympy solve compatible with tensorflow GradientTape?

I want to solve for p in terms of Vo using sympy's solve, which solves an equation without an initial value. Then I want to find the derivative of p with respect to Vo using TensorFlow's automatic differentiation. I wrote the code below, and the value was…
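
GradientTape cannot see through sympy objects, so the two are not directly compatible; one route that may work for simple closed forms is converting the symbolic solution to TensorFlow ops with lambdify. A hedged sketch on an invented equation:

    import sympy as sp
    import tensorflow as tf

    Vo, p = sp.symbols("Vo p")
    sol = sp.solve(sp.Eq(Vo, p**2 + 3*p), p)[1]   # one root of Vo = p^2 + 3p

    # emit TensorFlow ops instead of sympy objects
    p_of_Vo = sp.lambdify(Vo, sol, "tensorflow")

    v = tf.Variable(10.0)
    with tf.GradientTape() as tape:
        out = p_of_Vo(v)
    print(tape.gradient(out, v))                  # dp/dVo via AD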
0 votes, 0 answers

No gradients provided for any variable - Custom loss function with random weights depending on the Softmax output

I am having difficulty writing a custom loss function that makes use of some random weights generated according to the class/state predicted by the Softmax output. The desired property is: The model is a simple feedforward neural network with…
0 votes, 1 answer

Edge Pushing Algorithm for computing sparse Hessian

I'm trying to implement some of the AD algorithms myself, but I don't quite get the edge pushing algorithm by Gower and Mello for computing sparse Hessians. Does a new computational graph of the "original gradient" need to be generated (for example…
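
Not edge pushing itself, but one of its conceptual building blocks, the Hessian-vector product, needs no explicit graph of the gradient; in JAX, forward-over-reverse composition gives it directly (our own toy function below):

    import jax
    import jax.numpy as jnp

    def f(x):
        return jnp.sum(jnp.sin(x) * x**2)

    def hvp(f, x, v):
        # differentiate grad(f) along direction v (forward-over-reverse);
        # the dense Hessian is never materialized
        return jax.jvp(jax.grad(f), (x,), (v,))[1]

    x = jnp.arange(3.0)
    print(hvp(f, x, jnp.ones(3)))   # the product H @ v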
0 votes, 1 answer

Automatically change the version of only the assembly which has code changes in .NET

I am creating a patch for my application to deploy small changes to customers. In my application I have 100 .csproj projects. Out of those 100 libraries, I made code changes in class libraries A, B, and C, and library D calls the A, B, and C libraries. So is there any way…
0 votes, 2 answers

automatic differentiation vector-Jacobian products in linear time?

I'm new to the inner workings of automatic differentiation and came across some papers and slides stating that vector-Jacobian products can be computed in linear time using automatic differentiation. Specifically, it is written: $e^\top (…
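
That linear-time claim is exactly what reverse mode provides: one vector-Jacobian product costs a small constant times one evaluation of the function, regardless of the input dimension. A JAX sketch with an arbitrary test function:

    import jax
    import jax.numpy as jnp

    def f(x):                      # f: R^5 -> R^2, arbitrary example
        return jnp.tanh(x[:2]) * jnp.sum(x)

    x = jnp.arange(5.0)
    y, vjp_fn = jax.vjp(f, x)      # one forward pass, tape recorded

    e = jnp.ones_like(y)           # the vector e in the question's e^T J
    (etJ,) = vjp_fn(e)             # one reverse pass, ~cost of evaluating f
    print(etJ)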
0 votes, 1 answer

How to assign equations element by element in autograd

I am trying to implement an autograd-based solver for a nonlinear PDE. As with most PDEs, I need to be able to operate on individual entries of my input vector, but apparently this breaks autograd. I have created this simple example to show the…
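
HIPS autograd indeed cannot differentiate through in-place index assignment (r[i] = ...); the standard fix is to assemble the array functionally, for example from slices with np.concatenate. A sketch with an invented residual:

    import autograd.numpy as np   # autograd's wrapped numpy
    from autograd import grad

    def residual(u):
        # r[1:-1] = ... would break tracing; build from slices instead
        interior = u[2:] - 2 * u[1:-1] + u[:-2]   # discrete Laplacian
        return np.concatenate((u[0:1], interior, u[-1:]))

    loss = lambda u: np.sum(residual(u) ** 2)
    print(grad(loss)(np.linspace(0.0, 1.0, 6)))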
0 votes, 0 answers

Resize Eigen::Matrix resulting from CwiseUnaryOperation for automatic differentiation (AD) types

This problem is related to "Workaround for resizing Eigen::Ref"; however, I do not have the restriction of trying to avoid templates (in fact, I would like a solution working with templates). I'm using the Eigen library (version 3.2.9, but…
tom • 361 • 3 • 11
0 votes, 1 answer

Using autograd on python wrapped C function with custom datatype

Can autograd in principle work on Python-wrapped C functions? The C function I'd like to differentiate expects arguments with a REAL8 data type, and I can successfully call it in Python by giving it either float or np.float64 arguments. Upon closer…
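
autograd cannot trace into compiled code, so the answer is no for the C internals, but yes if you register the wrapper as a primitive and supply its derivative yourself. A sketch with an invented stand-in for the C call:

    import autograd.numpy as np
    from autograd import grad
    from autograd.extend import primitive, defvjp

    @primitive
    def c_func(x):
        # stand-in for the wrapped C function; autograd treats it as opaque
        return np.float64(x) ** 2 + 1.0

    # hand-written reverse-mode rule: d/dx (x^2 + 1) = 2x
    defvjp(c_func, lambda ans, x: lambda g: g * 2.0 * x)

    f = lambda x: c_func(x) + 3.0 * x
    print(grad(f)(2.0))   # 2*2 + 3 = 7.0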
0 votes, 2 answers

Restricting function signatures while using ForwardDiff in Julia

I am trying to use ForwardDiff in a library where almost all functions are restricted to only take in Floats. I want to generalise these function signatures so that ForwardDiff can be used while still being restrictive enough so functions only take…
Stuart • 1,322 • 1 • 13 • 31
0 votes, 2 answers

How to get access to the partial derivatives of output with respect to inputs in deep learning model?

I want to create my own loss function in Keras, which contains derivatives. For example,

    def my_loss(x):
        def y_loss(y_true, y_pred):
            res = K.gradients(y_pred, x)
            return res
        return y_loss

is defined, and model =…
CSH • 497 • 1 • 5 • 16
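
In TF2, K.gradients evaluated inside a loss typically yields None in eager mode; a more reliable pattern is a custom train_step with nested tapes. A hedged sketch with an invented derivative penalty:

    import tensorflow as tf

    class GradPenaltyModel(tf.keras.Model):
        # adds a derivative-based term to the loss inside train_step,
        # replacing K.gradients-in-a-loss in TF2's eager world
        def train_step(self, data):
            x, y = data
            with tf.GradientTape() as outer:
                with tf.GradientTape() as inner:
                    inner.watch(x)
                    y_pred = self(x, training=True)
                dydx = inner.gradient(y_pred, x)
                loss = tf.reduce_mean((y - y_pred) ** 2) \
                     + 0.1 * tf.reduce_mean(dydx ** 2)   # illustrative penalty
            grads = outer.gradient(loss, self.trainable_variables)
            self.optimizer.apply_gradients(zip(grads, self.trainable_variables))
            return {"loss": loss}

    inp = tf.keras.Input(shape=(3,))
    hidden = tf.keras.layers.Dense(8, activation="tanh")(inp)
    out = tf.keras.layers.Dense(1)(hidden)
    model = GradPenaltyModel(inp, out)
    model.compile(optimizer="adam")
    model.fit(tf.random.normal([32, 3]), tf.random.normal([32, 1]),
              epochs=1, verbose=0)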
0 votes, 1 answer

Direct access to the automatic differentiation inside pyomo

Is it possible to directly access the automatic differentiation module that comes with pyomo? By that I mean: could I compute derivatives of any objective function (defined or not inside pyomo's interface) using pyomo?
Ben • 1
0 votes, 1 answer

Is Julia ForwardDiff applicable to very comprehensive function involving ODE integration and nested automatic differentiation?

I need to estimate the parameters of a continuous-discrete nonlinear stochastic dynamic system using Kalman filtering techniques. I'm going to use Julia's ode45() from ODE and implement an Extended Kalman Filter myself to compute the log-likelihood. The ODE is…
konstunn • 355 • 4 • 17