I have a database-like object containing many dask dataframes. I would like to work with the data, save it, and reload it the next day to continue the analysis.
To that end, I tried saving the dask dataframes (not the computation results, just the "plan of computation" itself) using pickle. Apparently it works (at least if I unpickle the objects on the exact same machine) ... but are there any pitfalls?