Python Parallel: no space, can't pickle

Posted by 孤者浪人 on 2019-12-11 01:14:03

Question


I am using Parallel from joblib in my Python code to train a CNN. The code structure is like this:

from joblib import Parallel, delayed

crf = CRF()
with Parallel(n_jobs=num_cores) as pal_worker:
    for epoch in range(n):
        # run crf.runCRF over the m samples in parallel for each epoch
        temp = pal_worker(delayed(crf.runCRF)(x[i], y[i]) for i in range(m))

The code can run successfully for 1 or 2 epochs, but then an error occurs that says (I list the main part that I think matters):

......
File "/data_shared/Docker/tsun/software/anaconda3/envs/pytorch04/lib/python3.5/site-packages/joblib/numpy_pickle.py", line 104, in write_array
pickler.file_handle.write(chunk.tostring('C'))
OSError: [Errno 28] No space left on device
"""
The above exception was the direct cause of the following exception:

return future.result(timeout=timeout)
File
......
_pickle.PicklingError: Could not pickle the task to send it to the workers.
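
From the traceback, the failure happens while joblib's numpy pickler is writing an array chunk to its scratch folder, so (as far as I understand) the free space that matters is that of whichever directory joblib uses for memmapping large arguments: the JOBLIB_TEMP_FOLDER environment variable if set, otherwise /dev/shm, otherwise the system temp dir. A quick, illustrative way to check those candidate locations:

import shutil
import tempfile

# Candidate scratch locations joblib may fall back to for memmapping
# large arrays (illustrative check; adjust the paths to your setup).
for path in ("/dev/shm", tempfile.gettempdir()):
    usage = shutil.disk_usage(path)
    print("{}: {:.2f} GiB free of {:.2f} GiB".format(
        path, usage.free / 2**30, usage.total / 2**30))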

I am confused, since the disk has a lot of space and the program can run successfully for 1 or 2 epochs. I also tried:

with Parallel(n_jobs=num_cores,temp_folder='/dev/shm/temp') as pal_worker:

since '/dev/shm/temp' has a lot of space, but it does not work. Could anyone help, please? Thanks a lot!
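
For reference, as far as I can tell there are two ways to point joblib's memmapping scratch space somewhere else, plus the option of disabling memmapping altogether via max_nbytes; a minimal sketch (the path and values below are just placeholders):

import os
from joblib import Parallel, delayed

# Placeholder path: any directory on a volume with plenty of free space.
os.environ["JOBLIB_TEMP_FOLDER"] = "/path/with/space"    # environment-variable route

with Parallel(n_jobs=4,
              temp_folder="/path/with/space",   # keyword route, takes precedence
              max_nbytes=None) as pal_worker:   # None disables memmapping of large arrays
    results = pal_worker(delayed(pow)(i, 2) for i in range(8))

print(results)   # [0, 1, 4, 9, 16, 25, 36, 49]

With max_nbytes=None the arguments are pickled in memory instead of being memmapped, so nothing is written to the scratch folder, at the cost of copying the arrays to each worker.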

Source: https://stackoverflow.com/questions/52166572/python-parallel-no-space-cant-pickle
