I have several different Flask apps that are used infrequently for real-time statistical computations.
Even so, performance needs to be good whenever somebody is browsing one of the apps, so at the moment I keep a powerful (and expensive) cloud instance to serve them.
I would like to use a single Dask cluster to offload the computational heavy lifting, but the apps depend on different versions of the same libraries, and I cannot change that. For example, each app comes in a pair of environments (production and test), and those will, by definition, run different versions of the same modules.
From what I've read in the docs, it is not trivial to have Dask workers load different versions of the same modules depending on which client is connecting, short of reloading the modules altogether.
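Here is a minimal sketch of what I mean; the scheduler address and the `compute_stats` function are placeholders, not real code from my apps:

```python
from flask import Flask, jsonify
from dask.distributed import Client

app = Flask(__name__)

# All apps would point at the same (shared) scheduler, each shipping
# functions built against its own library versions.
client = Client("tcp://dask-scheduler:8786")  # placeholder address

def compute_stats(values):
    # In reality each app would import its own pinned versions here,
    # e.g. pandas 1.x in production and pandas 2.x in test.
    import pandas as pd
    return {k: float(v) for k, v in pd.Series(values).describe().items()}

@app.route("/stats")
def stats():
    # The function is pickled, sent to a worker, and executed there,
    # so the worker has to resolve whatever imports it uses.
    future = client.submit(compute_stats, [1.0, 2.0, 3.0, 4.0])
    return jsonify(future.result())
```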
Is it possible to have a shared Dask cluster to offload computations from apps using different versions of the same modules?
-- EDIT --
I've seen a related issue here: https://github.com/cloudpipe/cloudpickle/issues/206
and a PR here: https://github.com/cloudpipe/cloudpickle/pull/391
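If I'm reading them right, the idea is to pickle code by value rather than by reference; cloudpickle 2.x exposes a `register_pickle_by_value` API along those lines, so perhaps each client could carry its own module versions to the workers with something like this (`my_stats_lib` and its `describe` function are hypothetical app-specific names):

```python
import cloudpickle
from dask.distributed import Client

import my_stats_lib  # hypothetical module, pinned to a per-app version

# Embed my_stats_lib's code in every pickle instead of importing it on the
# worker, so whatever version the worker has installed is never consulted.
cloudpickle.register_pickle_by_value(my_stats_lib)

client = Client("tcp://dask-scheduler:8786")  # placeholder address
future = client.submit(my_stats_lib.describe, [1.0, 2.0, 3.0])
print(future.result())
```

That said, pickling by value only covers pure-Python modules, so I suspect it would not help with compiled dependencies like NumPy or pandas themselves.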