Questions tagged [autodiff]

Automatic Differentiation (AD) is a set of techniques based on the mechanical application of the chain rule to obtain derivatives of a function given as a computer program.
AD exploits the fact that every computer program, no matter how complicated, executes a sequence of elementary arithmetic operations (such as addition) and elementary functions (such as exp()).
By applying the chain rule repeatedly to these operations, derivatives of arbitrary order can be computed automatically and accurately to working precision.

Conceptually, AD is different from symbolic differentiation and approximations by divided differences.
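For illustration, here is a minimal forward-mode sketch in Python built on dual numbers; the Dual class and the free-standing exp helper are illustrative names, not the API of any particular AD library. Each value carries its derivative alongside it, so every arithmetic operation applies the chain rule mechanically as the program executes.

    import math

    class Dual:
        """A number paired with its derivative with respect to the input."""
        def __init__(self, value, deriv=0.0):
            self.value = value   # f(x)
            self.deriv = deriv   # f'(x)

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # sum rule: (f + g)' = f' + g'
            return Dual(self.value + other.value, self.deriv + other.deriv)
        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # product rule: (f * g)' = f' * g + f * g'
            return Dual(self.value * other.value,
                        self.deriv * other.value + self.value * other.deriv)
        __rmul__ = __mul__

    def exp(x):
        # chain rule: (exp(f))' = exp(f) * f'
        return Dual(math.exp(x.value), math.exp(x.value) * x.deriv)

    # Differentiate f(x) = x * exp(x) + 3x at x = 2 by seeding dx/dx = 1.
    x = Dual(2.0, 1.0)
    y = x * exp(x) + 3 * x
    print(y.value)   # f(2)  = 2*e^2 + 6
    print(y.deriv)   # f'(2) = 3*e^2 + 3

Unlike a divided-difference approximation, which incurs truncation error from the chosen step size, the derivative above is exact up to floating-point rounding; and unlike symbolic differentiation, no closed-form expression for f'(x) is ever constructed.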

AD is used in the following areas:

  • Numerical Methods
  • Sensitivity Analysis
  • Design Optimization
  • Data Assimilation & Inverse Problems

Home page: http://www.autodiff.org/

94 questions
0 votes, 1 answer

Does autodiff in tensorflow work if I have a for loop involved in constructing my graph?

I have a situation where I have a batch of images and in each image I have to perform some operation over a tiny patch in that image. Now the problem is the patch size is variable in each image in the batch. So this implies that I cannot vectorize…
0 votes, 1 answer

Does auto-differentiation in tensorflow work when combining activations from multiple nets into one objective?

I am new to tensorflow and trying to figure out if the auto-differentiation feature in tensorflow will solve my problem. So I have two nets, where each net outputs a latent vector. So let's say my net A outputs latent vector -La(Hxr) - where (H,r)…
0 votes, 2 answers

Julia: Optimize a cost function with `Optim.jl` and `autodiff` for integers

I would like to optimize (minimize) the following given function (quad_function) by using Optim.jl with automatic differentiation (autodiff=true). My objective function rounds Real values to whole numbers and is therefore step-like. As I use the autodiff…
swiesend • 1,051
-2 votes, 1 answer

How to install the "FastAD" C++ library in Rcpp?

I am trying to install a C++ library called FastAD ( https://github.com/JamesYang007/FastAD#user-guide) in Rcpp but the installation instructions are generic (not specifically for Rcpp). It would be greatly appreciated if someone could give me some…
E_1996 • 33