Python Multiprocessing Queue on Parent Exit

Submitted by ♀尐吖头ヾ on 2019-12-07 16:55:52

Question


The gist of my question: what happens to a multiprocessing queue when the parent process (a daemon in this case) is killed?

I have a daemon that runs in the background which queues up jobs for child processes:

import multiprocessing as MP

class manager(Daemon):  # Daemon is the question's own base class
    def run(self):
        someQueue = MP.Queue()

        # Note the trailing comma: args must be a tuple, not a parenthesized value.
        someChild = MP.Process(target=someCode, args=(someArgs,))
        someChild.start()
        ...

If the manager is killed (assuming it wasn't in the middle of using someQueue, which would corrupt the queue as the documentation warns), is there any way to recover the data in the queue?

I see two candidate solutions: have someChild drain someQueue before the child process exits, or periodically dump the queues to disk so their state can be restored after the manager dies. Before implementing either, though, it would be nice to get nudged in the right direction.
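The second idea above (dumping and restoring queue state) can be sketched roughly as follows. This is a minimal illustration, not code from the question: `dump_queue` and `restore_queue` are hypothetical helpers, and draining only captures items the feeder thread has already flushed, so it assumes no other process is consuming concurrently.

```python
import multiprocessing as MP
import pickle
import queue  # only for the queue.Empty exception


def dump_queue(q, path):
    """Drain everything currently in q and pickle it to path.

    Hypothetical helper: assumes no other process is reading from q
    while we drain it (e.g. called from a shutdown handler).
    """
    items = []
    while True:
        try:
            # Short timeout gives the feeder thread a moment to flush.
            items.append(q.get(timeout=0.1))
        except queue.Empty:
            break
    with open(path, "wb") as f:
        pickle.dump(items, f)
    return items


def restore_queue(path):
    """Rebuild a fresh multiprocessing.Queue from a previous dump."""
    with open(path, "rb") as f:
        items = pickle.load(f)
    q = MP.Queue()
    for item in items:
        q.put(item)
    return q
```

A shutdown path could call `dump_queue(someQueue, "state.pkl")` and a restarted manager could call `restore_queue("state.pkl")`, though this still loses anything enqueued after the last dump.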

Thanks,


Answer 1:


It sounds like you want persistent/reliable queuing. I believe the multiprocessing.Queue class is implemented with pipes (just like you would get with a popen() call), so the data is relatively transient and you'd probably have to do some OS-level trickery to grab the contents. You might look into writing your own persistent queue class that uses a filesystem file (assuming your OS and filesystem support locking) to store the queue contents. Then you can provide all of the analysis tools you desire to inspect the queue and recover unprocessed data.
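A file-backed queue along the lines the answer suggests might look like this. This is a rough sketch, not a production implementation: the `PersistentQueue` class is hypothetical, it serializes the whole queue on every operation (fine for small job lists, slow for large ones), and it relies on `fcntl.flock`, so it is POSIX-only.

```python
import fcntl
import os
import pickle


class PersistentQueue:
    """Sketch of a persistent queue backed by a single pickle file.

    Items live on disk rather than in a pipe, so they survive the
    death of any process using the queue. Uses fcntl.flock for
    cross-process exclusion (POSIX-only).
    """

    def __init__(self, path):
        self.path = path
        # Create an empty backing file on first use.
        if not os.path.exists(path):
            with open(path, "wb") as f:
                pickle.dump([], f)

    def _locked(self):
        # flock is released automatically when the file is closed.
        f = open(self.path, "rb+")
        fcntl.flock(f, fcntl.LOCK_EX)
        return f

    def put(self, item):
        with self._locked() as f:
            items = pickle.load(f)
            items.append(item)
            f.seek(0)
            f.truncate()
            pickle.dump(items, f)

    def get(self):
        with self._locked() as f:
            items = pickle.load(f)
            if not items:
                raise IndexError("queue is empty")
            item = items.pop(0)
            f.seek(0)
            f.truncate()
            pickle.dump(items, f)
        return item
```

Because the state is entirely on disk, a restarted manager can open the same path and pick up exactly where the dead one left off, and inspecting unprocessed jobs is just a matter of unpickling the file.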



Source: https://stackoverflow.com/questions/9085176/python-multiprocessing-queue-on-parent-exit
