Dask: is it safe to pickle a dataframe for later use?

Submitted by 泄露秘密 on 2019-12-12 18:02:22

Question


I have a database-like object containing many dask dataframes. I would like to work with the data, save it, and reload it the next day to continue the analysis.

Therefore, I tried saving dask dataframes (not computation results, just the "plan of computation" itself) using pickle. Apparently it works (at least if I unpickle the objects on the exact same machine) ... but are there any pitfalls?


Answer 1:


Generally speaking, it is usually safe. However, there are a few caveats:

  1. If your dask.dataframe contains custom functions, such as with df.apply(lambda x: x), then the internal function will not be pickleable. However, it will still be serializable with cloudpickle.
  2. If your dask.dataframe contains references to files that are only valid on your local computer, then, while it will still be serializable, the deserialized version on another machine may no longer be useful.
  3. If your dask.dataframe contains dask.distributed Future objects, such as would occur if you use Executor.persist on a cluster, then these are not currently serializable.
  4. I recommend using a dask version >= 0.11.0.


Source: https://stackoverflow.com/questions/39147120/dask-is-it-safe-to-pickle-a-dataframe-for-later-use
