Creating a timeout function in Python with multiprocessing

Submitted by 半世苍凉 on 2021-01-28 13:35:42

Question


I'm trying to create a timeout function in Python 2.7.11 (on Windows) with the multiprocessing library.

My basic goal is to return one value if the function times out, and the actual value if it doesn't time out.

My approach is the following:

from multiprocessing import Process, Manager

def timeoutFunction(puzzleFileName, timeLimit):
  manager = Manager()
  returnVal = manager.list()

  # Create worker function
  def solveProblem(return_val):
    return_val[:] = doSomeWork(puzzleFileName) # doSomeWork() returns list

  p = Process(target=solveProblem, args=[returnVal])
  p.start()

  p.join(timeLimit)
  if p.is_alive():
    p.terminate()
    returnVal = ['Timeout']

  return returnVal

And I call the function like this:

if __name__ == '__main__':
  print timeoutFunction('example.txt', 600)

Unfortunately this doesn't work, and I receive some sort of EOFError from pickle.py.

Can anyone see what I'm doing wrong?

Thanks in advance,
Alexander

Edit: doSomeWork() is not an actual function, just a placeholder for some other work I do. That work is not done in parallel and does not use any shared variables. I'm only trying to run a single function and have it possibly time out.


Answer 1:


You can use the Pebble library for this.

from pebble import concurrent
from concurrent.futures import TimeoutError

TIMEOUT_IN_SECONDS = 10

@concurrent.process(timeout=TIMEOUT_IN_SECONDS)
def function(foo, bar=0):
    return foo + bar

if __name__ == '__main__':  # required on Windows, where child processes re-import this module
    future = function(1, bar=2)

    try:
        result = future.result()  # blocks until results are ready or timeout
    except TimeoutError as error:
        print "Function took longer than %d seconds" % error.args[1]
        result = 'timeout'

The documentation has more complete examples.

The library will terminate the function if it times out, so you don't need to worry about IO or CPU being wasted.

EDIT:

If you're doing an assignment, you can still look at its implementation.

Short example:

from multiprocessing import Pipe, Process

def worker(pipe, function, args, kwargs):
    try:
        results = function(*args, **kwargs)
    except Exception as error:
        results = error

    pipe.send(results)

reader, writer = Pipe(duplex=False)  # Pipe returns a (receive, send) pair of connections
process = Process(target=worker, args=(writer, function, args, kwargs))
process.start()

if reader.poll(timeout=5):  # True as soon as a result is available
    results = reader.recv()
else:  # nothing arrived in time: kill the child
    process.terminate()
    results = 'timeout'
process.join()

Pebble provides a neat API, takes care of corner cases and uses more robust mechanisms. Yet this is more or less what it does under the hood.
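Wired into a reusable helper, that sketch might look like the following. This is only an illustration of the pattern: the run_with_timeout name and the sample fast/slow functions are made up here, not part of the original answer.

```python
import time
from multiprocessing import Pipe, Process


def worker(conn, function, args, kwargs):
    # Run the target and ship the result (or the exception) back to the parent.
    try:
        results = function(*args, **kwargs)
    except Exception as error:
        results = error
    conn.send(results)


def run_with_timeout(function, args=(), kwargs=None, timeout=5):
    reader, writer = Pipe(duplex=False)
    process = Process(target=worker,
                      args=(writer, function, args, kwargs or {}))
    process.start()
    if reader.poll(timeout):  # True as soon as a result is available
        results = reader.recv()
    else:                     # nothing arrived in time: kill the child
        process.terminate()
        results = 'timeout'
    process.join()
    return results


def fast(x):
    return x * 2


def slow(x):
    time.sleep(30)
    return x


if __name__ == '__main__':
    print(run_with_timeout(fast, (21,)))            # 42
    print(run_with_timeout(slow, (1,), timeout=1))  # timeout
```

Note that on Windows the target functions must be defined at module level so they can be pickled, which is exactly the issue the question ran into.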




Answer 2:


The problem seems to have been that the function solveProblem was defined inside my outer function. On Windows, multiprocessing has to pickle the target function to send it to the child process, and nested functions can't be pickled. Once I moved it to module level, it worked fine.
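For reference, a runnable version of the question's code with solveProblem moved to module level. The doSomeWork body here is just a stand-in, since the real work isn't shown in the question.

```python
from multiprocessing import Process, Manager


def doSomeWork(puzzleFileName):
    # Stand-in for the real work; the original returns a list.
    return ['solved', puzzleFileName]


def solveProblem(puzzleFileName, return_val):
    # Module-level, so Windows can pickle it when spawning the child process.
    return_val[:] = doSomeWork(puzzleFileName)


def timeoutFunction(puzzleFileName, timeLimit):
    manager = Manager()
    returnVal = manager.list()

    p = Process(target=solveProblem, args=(puzzleFileName, returnVal))
    p.start()

    p.join(timeLimit)
    if p.is_alive():  # still running after timeLimit seconds
        p.terminate()
        return ['Timeout']

    return list(returnVal)  # copy out of the manager proxy


if __name__ == '__main__':
    print(timeoutFunction('example.txt', 600))  # ['solved', 'example.txt']
```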

I'll mark noxdafox's answer as accepted, since implementing the Pebble solution led me to this fix.

Thanks all!



Source: https://stackoverflow.com/questions/37098360/creating-a-timeout-function-in-python-with-multiprocessing
