I would like to do a Bayesian fit of some experimental observations.
The function that wraps up my model is not analytical and requires several numerical integrals, so a single evaluation takes about 0.5 s. This becomes a problem because the fitting procedure calls the function many times, varying its parameters at each call.
I attempted to make the code run faster by vectorizing it as much as possible (with numpy) and by using a just-in-time compiler (numba.jit). This gained me a factor of 2 in computation time, but that is still not enough.
I am aiming at computation times of the order of milliseconds.
I thought that a reasonable solution could be to compute the values of the function in advance on a grid of the input parameters and build a lookup table.
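To make the idea concrete, here is a minimal sketch of what I have in mind for the precomputation step. The model function, parameter names, grid ranges, and file name are all placeholders; the real model is the slow function with the numerical integrals:

```python
import numpy as np

# Hypothetical stand-in for the slow model; the real one involves
# several numerical integrals and takes ~0.5 s per call.
def slow_model(a, b):
    return np.sin(a) * np.exp(-b)

# Regular grids over the two parameters (ranges/resolution are placeholders).
a_grid = np.linspace(0.0, np.pi, 101)
b_grid = np.linspace(0.0, 2.0, 51)

# Evaluate once on the full grid via broadcasting.
A, B = np.meshgrid(a_grid, b_grid, indexing="ij")
values = slow_model(A, B)  # shape (101, 51)

# Save grids and values together in one compressed .npz file.
np.savez_compressed("lookup_table.npz", a=a_grid, b=b_grid, values=values)
```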
I would like to know the best way to implement this, especially regarding:
- How to save the precomputed values of the function to maximize speed (data structure, file format)?
- How to open/access the precomputed data?
- Which interpolation technique to use for evaluating the function at parameter values that are not on the lookup-table grid?
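For the interpolation part, one option I am considering is `scipy.interpolate.RegularGridInterpolator`, which works directly on a regular parameter grid. A self-contained sketch (the model function and grid ranges are again placeholders):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Small example table; in practice these would be the precomputed values
# of the slow model on a regular parameter grid.
a_grid = np.linspace(0.0, np.pi, 101)
b_grid = np.linspace(0.0, 2.0, 51)
A, B = np.meshgrid(a_grid, b_grid, indexing="ij")
values = np.sin(A) * np.exp(-B)

# Linear interpolation on a regular grid is cheap to set up and evaluate;
# method="cubic" is smoother but more expensive.
interp = RegularGridInterpolator((a_grid, b_grid), values, method="linear")

# Evaluate at off-grid parameter points: takes an (N, 2) array, returns (N,).
points = np.array([[0.5, 1.0], [1.2, 0.3]])
approx = interp(points)
exact = np.sin(points[:, 0]) * np.exp(-points[:, 1])
```

Evaluating the interpolator at a batch of points like this is microseconds-fast, which would be compatible with the millisecond budget, but I am unsure how the interpolation error scales with the grid resolution for my actual model.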