
I am trying to use the built-in HMC sampler of TensorFlow Probability to generate samples from the posterior. According to the documentation, one provides a callable target_log_prob_fn that returns the (possibly unnormalized) log density of the posterior, and TensorFlow Probability automatically computes its gradient (with respect to the parameters being inferred) to perform the Hamiltonian MCMC updates.
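For context, here is a minimal sketch of the standard usage the documentation describes, where target_log_prob_fn is built from TF ops so TFP can differentiate it (toy Gaussian target; the step size and number of leapfrog steps are illustrative):

```python
import tensorflow as tf
import tensorflow_probability as tfp

# Unnormalized log density of a standard normal (toy stand-in).
def target_log_prob_fn(x):
    return -0.5 * tf.reduce_sum(x ** 2)

kernel = tfp.mcmc.HamiltonianMonteCarlo(
    target_log_prob_fn=target_log_prob_fn,
    step_size=0.1,
    num_leapfrog_steps=3)

samples = tfp.mcmc.sample_chain(
    num_results=1000,
    num_burnin_steps=500,
    current_state=tf.zeros([2]),
    kernel=kernel,
    trace_fn=None)
```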

However, for my application, the likelihood and the gradient of the resulting posterior are computed outside of TensorFlow (they involve the solution of a partial differential equation, which I can compute efficiently with another Python library). So I was wondering: is there a way to directly supply the (unnormalized) log density of the posterior together with its gradient, so that they are used for the Hamiltonian MCMC updates? In other words, can I ask the HMC sampler to use the gradients I provide instead of computing them itself?
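One route I have been sketching (not verified on my full model) is to wrap the external computation with tf.custom_gradient, calling out to Python via tf.py_function, so that when HMC asks for gradients it gets the ones computed by my solver. The external_log_prob and external_grad names below are hypothetical stand-ins for the PDE-based routines:

```python
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

# Hypothetical stand-ins for the external, PDE-based routines.
def external_log_prob(x):
    return -0.5 * np.sum(x ** 2)

def external_grad(x):
    return -x

@tf.custom_gradient
def target_log_prob_fn(x):
    # Forward pass: evaluate the external (unnormalized) log posterior.
    logp = tf.py_function(
        func=lambda v: np.float64(external_log_prob(v.numpy())),
        inp=[x], Tout=tf.float64)

    def grad_fn(dy):
        # Backward pass: hand back the externally computed gradient
        # instead of letting TF differentiate through the solver.
        g = tf.py_function(
            func=lambda v: np.asarray(external_grad(v.numpy()), np.float64),
            inp=[x], Tout=tf.float64)
        g.set_shape(x.shape)
        return dy * g

    return logp, grad_fn

kernel = tfp.mcmc.HamiltonianMonteCarlo(
    target_log_prob_fn=target_log_prob_fn,
    step_size=0.1,
    num_leapfrog_steps=3)

samples = tfp.mcmc.sample_chain(
    num_results=1000,
    num_burnin_steps=500,
    current_state=tf.zeros([2], dtype=tf.float64),
    kernel=kernel,
    trace_fn=None)
```

If this pattern is workable, the remaining question would be the adaptive tuning part, e.g. wrapping the kernel in tfp.mcmc.SimpleStepSizeAdaptation.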

I found a related question here, but it does not exactly answer mine.

  • It is probably easier if you ditch tf and look for another implementation of hmc – BlackBear Mar 23 '20 at 10:42
  • Do you have any suggestion? Part of my model is in tf, and hence I tried using tfp as a first attempt. However, I can easily modify my code to work with a different Python library as well. I tried using [pyhmc](https://pythonhosted.org/pyhmc/index.html) (its interface is sketched below), but it seems to be highly sensitive to parameters like the step size, the number of leapfrog steps, etc., so I am looking for another efficient HMC implementation/library where I can provide my custom gradients and which has automatic/adaptive HMC parameter tuning. – log_posterior Mar 24 '20 at 01:03
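For reference, pyhmc's documented interface already supports user-supplied gradients: the target function returns both the log probability and its gradient. A minimal sketch following its docs, with a toy Gaussian standing in for the PDE-based posterior:

```python
import numpy as np
from pyhmc import hmc

# The function returns (log_prob, gradient), so the sampler uses the
# gradient you supply rather than differentiating anything itself.
def logprob(x):
    logp = -0.5 * np.sum(x ** 2)   # toy stand-in for the real posterior
    grad = -x                      # its analytic gradient
    return logp, grad

samples = hmc(logprob, x0=np.random.randn(5), n_samples=10000)
```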

0 Answers