python multiple threaded processes for running executables


I don't think the GIL by itself would kill this, unless opening a file gets you into some weird deadlock or spinlock condition. In general, you want threads in cases like this, where you're I/O-bound. If anything, the fact that the threads really are able to run concurrently probably contributes to the other threads failing, rather than letting each of them open the file successfully.

On slide fifteen of this presentation, the author points out that the GIL is released during blocking I/O calls to give other threads a chance to run.
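To see what that looks like in practice, here is a minimal threading sketch (my own illustration, not code from the question; the filename test is just a placeholder) in which several threads each do a blocking open and read of the same file, releasing the GIL while they wait:

import threading

def read_file(path):
    # open() and readlines() block on I/O; CPython releases the GIL while
    # the call is waiting, so the other threads get a chance to run.
    with open(path, 'r') as file_obj:
        print(file_obj.readlines())

# 'test' is a placeholder filename for this sketch.
threads = [threading.Thread(target=read_file, args=('test',)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()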

The real problem here seems to be a lock on a file resource. I'm not really sure how Windows handles this, so I can't say exactly why the error is creeping up, but it looks like only one thread can actually hold a lock on the file at a time.

The other poster's point about multiple cores and the GIL might be coming into play, in that you could have some sort of priority inversion where the other two threads are starved, but I find that unlikely, given that the presentation above says a thread in the middle of a blocking operation releases the GIL for the other threads.

One thought is to try multiprocessing. I suspect you'll have better luck reading the file from multiple processes than from multiple threads.

Here is an example I wrote and tried on my OS X 10.7.3 machine; it opens a file named test whose contents are lol\n:

import multiprocessing

def open_file(x):
    # Each worker process opens the file independently and reads every line.
    with open(x, 'r') as file_obj:
        return file_obj.readlines()

if __name__ == '__main__':
    # Four worker processes each read the same file concurrently.
    pool = multiprocessing.Pool(4)
    print(pool.map(open_file, ['test'] * 4))

Here's the result when I execute it:

➜  ~ git:(master) ✗ python open_test.py
[['lol\n'], ['lol\n'], ['lol\n'], ['lol\n']]

Because of the Global Interpreter Lock, Python currently cannot exploit multiple cores to run Python code in parallel threads. Multithreading tends to be fraught with trouble anyway; it's better to use multiple processes if you can.
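Since the question title mentions running executables, a sketch of the same Pool pattern driving an external program through subprocess might look like the following (the echo command and its arguments are placeholders I picked for illustration, not anything from the question):

import multiprocessing
import subprocess

def run_command(arg):
    # Each worker process launches the external program and returns its output.
    # 'echo' is a placeholder; substitute the real executable and its arguments.
    return subprocess.check_output(['echo', arg])

if __name__ == '__main__':
    pool = multiprocessing.Pool(4)
    print(pool.map(run_command, ['one', 'two', 'three', 'four']))

On Windows the if __name__ == '__main__': guard matters, because multiprocessing starts fresh interpreter processes that re-import the script.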
