
I just started a project in which I need to sample from a high-dimensional (currently ~10^3-dimensional, but eventually ~10^6) and highly non-Gaussian posterior distribution via a Hamiltonian Monte Carlo (HMC) sampler. In my specific case the gradients of the potential can be computed analytically and the mass matrix can be crudely approximated. As far as I understand, most probabilistic programming packages (like Stan, TensorFlow Probability, Edward, PyMC3, ...) use some form of automatic or numerical gradient computation, which might be more time-consuming, as well as less efficient for convergence.

Right now I have a very naive Python implementation along the lines of this pseudocode, which is way too slow for high-dimensional computations. Do you know whether there exist any performance-boosting modifications of this algorithm, or whether there are already good, fast samplers around that explicitly take these analytical results into account?
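The question's pseudocode was not included in the post, so as a point of reference here is a minimal sketch of the kind of leapfrog HMC loop being described: a user-supplied analytic gradient `grad_U` and potential `U`, a diagonal approximate mass matrix, and a Metropolis accept/reject step. All function and variable names here are illustrative assumptions, not taken from the original post.

```python
import numpy as np

def hmc_sample(grad_U, U, x0, n_samples, eps=0.1, n_leapfrog=20, mass=None, rng=None):
    """Basic HMC using an analytic gradient of the potential U.

    mass: diagonal of an (approximate) mass matrix M; defaults to the identity.
    Returns (samples, acceptance_rate).
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    m = np.ones_like(x) if mass is None else np.asarray(mass, dtype=float)
    samples = np.empty((n_samples, x.size))
    accepted = 0
    for i in range(n_samples):
        # draw momentum p ~ N(0, M) for a diagonal M
        p = rng.normal(size=x.size) * np.sqrt(m)
        x_new, p_new = x.copy(), p.copy()
        # leapfrog integration, calling the analytic gradient directly
        p_new -= 0.5 * eps * grad_U(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += eps * p_new / m
            p_new -= eps * grad_U(x_new)
        x_new += eps * p_new / m
        p_new -= 0.5 * eps * grad_U(x_new)
        # Metropolis correction on the Hamiltonian H = U(x) + p^T M^-1 p / 2
        dH = (U(x_new) + 0.5 * np.sum(p_new**2 / m)) \
           - (U(x)     + 0.5 * np.sum(p**2 / m))
        if np.log(rng.uniform()) < -dH:
            x = x_new
            accepted += 1
        samples[i] = x
    return samples, accepted / n_samples

# example target: standard normal, U(x) = x.x / 2, so grad_U(x) = x
samples, rate = hmc_sample(lambda x: x, lambda x: 0.5 * x @ x,
                           np.zeros(10), n_samples=2000)
```

Even in plain NumPy, keeping the state as a single array and letting `grad_U` operate on it vectorized (rather than looping over coordinates) is usually the first large speedup over a naive per-dimension implementation; the gradient at the end of one trajectory can also be cached and reused as the first gradient of the next when the proposal is accepted.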

Peter O.
user6566791
  • It will be helpful to see what you have right now – Sharky Apr 12 '19 at 21:39
  • I think this one of things one can do with [custom Theano Ops in PyMC3](https://docs.pymc.io/Advanced_usage_of_Theano_in_PyMC3.html#writing-custom-theano-ops), but I've never used it myself. Maybe ask over on [the PyMC Discourse page](https://discourse.pymc.io/). – merv Apr 12 '19 at 22:12
  • This other question could also be useful: https://stackoverflow.com/questions/52759259/defining-grad-of-a-custom-op-theano – merv Apr 13 '19 at 02:39

0 Answers