Questions tagged [autodiff]

Automatic Differentiation (AD) is a set of techniques based on the mechanical application of the chain rule to obtain derivatives of a function given as a computer program.
AD exploits the fact that every computer program, no matter how complicated, executes a sequence of elementary arithmetic operations such as additions or elementary functions such as exp().
By applying the chain rule of derivative calculus repeatedly to these operations, derivatives of arbitrary order can be computed automatically, and accurate to working precision.

Conceptually, AD is different from symbolic differentiation and approximations by divided differences.
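
For example, in JAX (one of several AD frameworks that appear in the questions below), a single grad call applies the chain rule mechanically. The function f here is a made-up illustration, and the divided-difference line is included only for contrast:

    import jax
    import jax.numpy as jnp

    def f(x):
        return jnp.exp(x) * jnp.sin(x)   # a chain of elementary operations

    x = 1.0
    ad = jax.grad(f)(x)                      # mechanical chain rule: exact to working precision
    fd = (f(x + 1e-6) - f(x - 1e-6)) / 2e-6  # divided differences: truncation and round-off error
    print(ad, fd)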

AD is used in the following areas:

  • Numerical Methods
  • Sensitivity Analysis
  • Design Optimization
  • Data Assimilation & Inverse Problems

Home page: http://www.autodiff.org/

94 questions
2 votes · 1 answer

How to retrieve differentiation results with Eigen::AutoDiffScalar

I am learning to use this library. Trying to differentiate a simple function, y = x^2, does not yield the expected results (dy/dx = 2x = 16 when x = 8). #include #include #include…
user2658323 · 543
1 vote · 0 answers

Numerical instability in computing gradient using Jax with nested fixed point

I have written some Jax code to compute a log likelihood of a mixed logit model given some inputs - a vector of parameters parms and some data X. The log likelihood function embeds a nested fixed point algorithm. I am also using Jax to automatically…
Ben K · 11
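
For context (not the asker's code): the usual cure for instability when differentiating through a nested fixed point in JAX is to stop back-propagating through the forward iterations and instead apply the implicit function theorem via a custom VJP, essentially the fixed-point pattern from JAX's custom-derivatives tutorial. A minimal sketch, where the contraction map f is a made-up stand-in for the mixed-logit inner loop:

    import jax
    import jax.numpy as jnp

    def f(theta, z):
        # made-up contraction map standing in for the inner fixed-point update
        return jnp.tanh(theta * z) + 0.1

    @jax.custom_vjp
    def fixed_point(theta, z0):
        def cond(carry):
            z_prev, z = carry
            return jnp.abs(z - z_prev) > 1e-10
        def body(carry):
            _, z = carry
            return z, f(theta, z)
        _, z_star = jax.lax.while_loop(cond, body, (z0, f(theta, z0)))
        return z_star

    def fixed_point_fwd(theta, z0):
        z_star = fixed_point(theta, z0)
        return z_star, (theta, z_star)

    def fixed_point_bwd(res, z_bar):
        theta, z_star = res
        _, vjp_z = jax.vjp(lambda z: f(theta, z), z_star)
        # solve the adjoint fixed point u = z_bar + u * (df/dz) instead of
        # unrolling reverse mode through every forward iteration
        def cond(carry):
            u_prev, u = carry
            return jnp.abs(u - u_prev) > 1e-10
        def body(carry):
            _, u = carry
            return u, z_bar + vjp_z(u)[0]
        _, u_star = jax.lax.while_loop(cond, body, (z_bar, z_bar + vjp_z(z_bar)[0]))
        _, vjp_theta = jax.vjp(lambda t: f(t, z_star), theta)
        return vjp_theta(u_star)[0], jnp.zeros_like(z_star)

    fixed_point.defvjp(fixed_point_fwd, fixed_point_bwd)

    print(jax.grad(fixed_point)(0.5, jnp.array(0.0)))

Besides avoiding the memory cost of an unrolled reverse pass, this tends to be better conditioned than differentiating the iteration itself.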
1 vote · 1 answer

How can I implement a vmappable sum over a dynamic range in Jax?

I want to implement something like the following Python function in Jax, and wrap it with a call to vmap. I want it to be fully reverse-mode differentiable (with respect to x) using grad(), even after the vmap. def f(x,kmax): return sum ([x**k for…
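
One common workaround, sketched here under the assumption that a static upper bound KMAX is acceptable: keep the shape static and move the dynamic cutoff into a mask, which leaves the sum both vmappable and reverse-mode differentiable:

    import jax
    import jax.numpy as jnp

    KMAX = 10                                # assumed static upper bound on kmax

    def f(x, kmax):
        ks = jnp.arange(KMAX)                # static shape, vmap-friendly
        terms = x ** ks
        mask = ks < kmax                     # dynamic cutoff lives in the data
        return jnp.sum(jnp.where(mask, terms, 0.0))

    xs = jnp.array([2.0, 3.0])
    kmaxs = jnp.array([3, 5])
    print(jax.vmap(f)(xs, kmaxs))            # [7., 121.]
    print(jax.vmap(jax.grad(f))(xs, kmaxs))  # d/dx of each masked power sum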
1 vote · 1 answer

Confused about evaluating vector-Jacobian-product with non-identity vectors (JAX)

I'm confused about the meaning of evaluating vector-Jacobian-products when the vector used for the VJP is a non-identity row vector. My question pertains to vector-valued functions, not scalar functions like loss. I will show a concrete example…
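
For reference, a sketch of what a non-identity cotangent means: vjp_fn(v) returns vᵀJ, a v-weighted combination of the Jacobian's rows, in a single reverse pass (the function and numbers below are made up):

    import jax
    import jax.numpy as jnp

    def f(x):                          # vector-valued: R^2 -> R^2
        return jnp.array([x[0] * x[1], x[0] + x[1]])

    x = jnp.array([2.0, 3.0])          # J = [[3., 2.], [1., 1.]]
    y, vjp_fn = jax.vjp(f, x)

    v = jnp.array([1.0, 2.0])          # non-identity row vector
    print(vjp_fn(v))                   # v @ J = [1*3 + 2*1, 1*2 + 2*1] = [5., 4.]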
1 vote · 1 answer

How to use and interpret JAX Vector-Jacobian Product (VJP) for this example?

I am trying to learn how to find the Jacobian of a vector-valued ODE function using JAX. I am using the examples at https://implicit-layers-tutorial.org/implicit_functions/ That page implements its own ODE integrator and associated custom…
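
Independent of that tutorial's custom integrator, the core recipe is small: one VJP call with the standard basis vector e_i as the cotangent recovers row i of the Jacobian of the right-hand-side function. A sketch with a made-up linear RHS:

    import jax
    import jax.numpy as jnp

    def rhs(y):                                 # made-up ODE right-hand side
        return jnp.array([y[1], -y[0] - 0.1 * y[1]])

    y = jnp.array([1.0, 2.0])
    _, vjp_fn = jax.vjp(rhs, y)

    J = jnp.stack([vjp_fn(e)[0] for e in jnp.eye(2)])   # rows via basis cotangents
    print(J)                                    # [[ 0.   1. ] [-1.  -0.1]]
    print(jax.jacrev(rhs)(y))                   # identical; jacrev does exactly this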
1 vote · 0 answers

std::min, std::max and autoDiff

When implementing the ReLU function for AutoDiff, one of the usual methods is the std::max function; other implementations (conditional statements) work correctly, but an attempt to implement it with the max function returns only 0 over the whole range. On input…
Arek · 31
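
The question is about C++ dual-number types, but the underlying requirement generalizes: the max used in ReLU must propagate the derivative of whichever branch wins, as JAX's jnp.maximum does in this small sketch:

    import jax
    import jax.numpy as jnp

    relu = lambda x: jnp.maximum(x, 0.0)     # derivative follows the selected branch
    xs = jnp.array([-2.0, 3.0])
    print(jax.vmap(jax.grad(relu))(xs))      # [0., 1.]

A std::max overload that silently converts the dual number to a plain double would explain a derivative that is 0 everywhere.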
1 vote · 2 answers

How to wrap a numpy function to make it work with jax.numpy?

I have some Jax code that requires auto differentiation, and in part of the code I would like to call a function from a library written in NumPy. When I try this, I get: The numpy.ndarray conversion method __array__() was called on the JAX…
Pablo · 123
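
One workable pattern (a sketch, not necessarily the accepted answer): escape tracing with jax.pure_callback so the NumPy routine sees concrete arrays, and attach a custom VJP, since JAX cannot differentiate through the callback. Here np_fn is a made-up stand-in for the external library function:

    import jax
    import jax.numpy as jnp
    import numpy as np

    def np_fn(x):
        return np.sin(x)      # stand-in for the NumPy-only library routine

    @jax.custom_vjp
    def wrapped(x):
        # pure_callback runs real NumPy on concrete values
        return jax.pure_callback(np_fn, jax.ShapeDtypeStruct(x.shape, x.dtype), x)

    def wrapped_fwd(x):
        return wrapped(x), x

    def wrapped_bwd(x, g):
        # the derivative must be supplied by hand; here we know (d/dx) sin x = cos x
        return (g * jnp.cos(x),)

    wrapped.defvjp(wrapped_fwd, wrapped_bwd)

    x = jnp.linspace(0.0, 1.0, 4)
    print(jax.grad(lambda x: wrapped(x).sum())(x))   # matches jnp.cos(x)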
1 vote · 1 answer

Getting the expected dimensions of the Jacobian with JAX?

I am trying to get the Jacobian for a simple parameterization function within JAX. The code is as follows: # imports import jax import jax.numpy as jnp from jax import random # simple parameterization function def reparameterize(v_params): …
hasco641 · 69
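
As general orientation (made-up function, not the asker's reparameterize): for f mapping R^n to R^m, jax.jacfwd and jax.jacrev both return shape (m, n), output dimensions first:

    import jax
    import jax.numpy as jnp

    def f(v):                         # R^3 -> R^2
        return jnp.array([v[0] * v[1], v[2] ** 2])

    v = jnp.ones(3)
    print(jax.jacfwd(f)(v).shape)     # (2, 3): (output dims, input dims)

For pytree-valued parameters the same rule applies leaf by leaf, which is usually where surprising extra dimensions come from.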
1 vote · 1 answer

How to use R's optim() function with a function returning both the function value and the gradient?

I have written a function which evaluates the Laplace approximate marginal likelihood of complex mixed models. The implementation uses sparse matrices with Eigen/RcppEigen, and the C++ autodiff library. It is available by installing the development…
Øystein S · 536
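
For comparison outside R: the same value-plus-gradient pattern exists in SciPy, whose minimize accepts a function returning the pair (f, g) when jac=True. This JAX sketch uses a made-up objective in place of the Laplace approximate marginal likelihood:

    import numpy as np
    import jax
    import jax.numpy as jnp
    from scipy.optimize import minimize

    def nll(p):                       # made-up objective
        return jnp.sum((p - jnp.array([1.0, -2.0])) ** 2)

    value_and_grad = jax.jit(jax.value_and_grad(nll))

    def fun(p):
        v, g = value_and_grad(jnp.asarray(p))
        return float(v), np.asarray(g, dtype=np.float64)

    res = minimize(fun, x0=np.zeros(2), jac=True, method="BFGS")
    print(res.x)                      # ≈ [ 1., -2.]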
1 vote · 1 answer

Tensorflow Gradient tape 'unknown value for unconnected gradients'

I'm trying to understand why I'm getting an error when using gradient tape to take the derivative of a function. I try to take the derivative of Power with respect to T, defined as: import tensorflow as tf import numpy as np from scipy.fft…
J Crane · 11
1 vote · 1 answer

Jax: Take derivative with respect to index of vector-valued argument

Does Jax support taking the derivative w.r.t. an index of a vector-valued variable? Consider this example (where a is a vector/array): def test_func(a): return a[0]**a[1] I can pass the argument number into grad(..), but I cannot seem to pass…
user654123 · 465
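
For orientation: grad differentiates with respect to whole arguments, so the usual move is to take the gradient of the full array and index into it (illustrative values below):

    import jax
    import jax.numpy as jnp

    def test_func(a):
        return a[0] ** a[1]

    a = jnp.array([2.0, 3.0])
    g = jax.grad(test_func)(a)        # gradient w.r.t. the whole array a
    print(g[0])                       # a[1] * a[0]**(a[1]-1) = 12.0
    print(g[1])                       # a[0]**a[1] * log(a[0]) ≈ 5.545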
1 vote · 1 answer

Compute partial derivatives with `madness`

The madness package, as mentioned here, is nice for autodiff in R. I would now like to compute a derivative wrt x of a derivative wrt y: $\frac{\partial}{\partial x}\frac{\partial}{\partial y}xy$. How can this be done using madness? update:…
Sam Weisenthal · 2,791
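
Not madness-specific, but the target quantity is easy to sanity-check with any AD system via nested derivatives; in JAX, for instance, ∂²(xy)/∂x∂y comes out as:

    import jax

    f = lambda x, y: x * y
    d2f = jax.grad(jax.grad(f, argnums=1), argnums=0)   # d/dx of d/dy
    print(d2f(2.0, 3.0))                                # 1.0 for any x, y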
1 vote · 1 answer

Swift AutoDiff: How do we make a struct have a member variable that is a differentiable function of more than one parameter?

I would like to have the following: import _Differentiation struct S { var f: @differentiable(reverse) (Double, Double) -> Double } but the compiler complains Error: Abort trap: 6 and the start of the stack trace is Assertion failed:…
bjschoenfeld · 402
1 vote · 1 answer

How to write a JAX custom vector-Jacobian product (vjp) for softmax

In order to understand JAX's reverse mode auto-diff I tried to write a custom_vjp for softmax like this: import jax import jax.numpy as jnp import numpy as np @jax.custom_vjp def stablesoftmax(x): print(f"input: {x} shape: {x.shape}") expc…
pups · 82
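
For reference, the softmax Jacobian at output s is diag(s) − s sᵀ, so the whole VJP collapses to s · (g − ⟨g, s⟩). A sketch of a custom_vjp built on that identity (one way to write it, not necessarily the asker's intended code):

    import jax
    import jax.numpy as jnp

    @jax.custom_vjp
    def stablesoftmax(x):
        z = x - jnp.max(x)                 # shift for numerical stability
        e = jnp.exp(z)
        return e / jnp.sum(e)

    def fwd(x):
        s = stablesoftmax(x)
        return s, s                        # the output is the only residual needed

    def bwd(s, g):
        # Jacobian is diag(s) - s s^T, so vjp(g) = s * (g - <g, s>)
        return (s * (g - jnp.dot(g, s)),)

    stablesoftmax.defvjp(fwd, bwd)

    x = jnp.array([1.0, 2.0, 3.0])
    print(jax.jacrev(stablesoftmax)(x))    # agrees with jax.jacrev(jax.nn.softmax)(x)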
1 vote · 1 answer

Auto-derivative functions in RcppArmadillo?

I want to calculate the derivative of the function f with Rcpp. I just found some resources at https://cran.r-project.org/web/packages/StanHeaders/vignettes/stanmath.html, which use StanHeaders and RcppEigen. Since all my program is coded by…
Xia.Song · 416