Python Requests: Don't wait for request to finish

遇见更好的自我 · 2020-12-05 06:55

In Bash, it is possible to execute a command in the background by appending &. How can I do it in Python?

while True:
    data = raw_input()
    requests.post(url, data=data)  # Don't wait for this request to finish!
5 Answers
  •  挽巷 (OP)
     2020-12-05 07:44

    I use multiprocessing.dummy.Pool. I create a singleton thread pool at the module level, and then use pool.apply_async(requests.get, [params]) to launch the task.

    This call gives me a future (an `AsyncResult`), which I can add to a list with other futures indefinitely until I'd like to collect all or some of the results.

    multiprocessing.dummy.Pool is, against all logic and reason, a THREAD pool and not a process pool.
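    That claim is easy to verify without any network access. The sketch below (my illustration, not part of the original answer) checks that the pool's workers run as threads inside the same process, and that in CPython `multiprocessing.dummy.Pool()` actually constructs a `multiprocessing.pool.ThreadPool`:

```python
from multiprocessing.dummy import Pool
from multiprocessing.pool import ThreadPool
import os

pool = Pool(2)

# The workers are threads inside this same process, so a task sees the
# parent's PID -- a real process pool would report a child process's PID.
worker_pid = pool.apply_async(os.getpid).get()
same_process = worker_pid == os.getpid()

# Despite the name, Pool() here builds a ThreadPool under the hood.
is_thread_pool = isinstance(pool, ThreadPool)

print(same_process, is_thread_pool)
```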

    Example (works in both Python 2 and 3, as long as requests is installed):

    from multiprocessing.dummy import Pool
    
    import requests
    
    pool = Pool(10) # Creates a pool with ten threads; more threads = more concurrency.
                    # "pool" is a module attribute; you can be sure there will only
                    # be one of them in your application
                    # as modules are cached after initialization.
    
    if __name__ == '__main__':
        futures = []
        for x in range(10):
            futures.append(pool.apply_async(requests.get, ['http://example.com/']))
        # futures is now a list of 10 futures.
        for future in futures:
            print(future.get()) # For each future, wait until the request is
                                # finished and then print the response object.
    

    The requests will be executed concurrently, so running all ten of these requests should take no longer than the longest one. This strategy will only use one CPU core, but that shouldn't be an issue because almost all of the time will be spent waiting for I/O.
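    The timing claim above can also be checked without touching the network. In this sketch (again my illustration; `slow_task` is a hypothetical stand-in for `requests.get`), ten tasks that each sleep 0.2 seconds complete in roughly 0.2 seconds total on a ten-thread pool, not the 2 seconds sequential execution would take:

```python
from multiprocessing.dummy import Pool
import time

pool = Pool(10)

def slow_task(n):
    time.sleep(0.2)  # stands in for waiting on a network response
    return n * n

start = time.monotonic()
futures = [pool.apply_async(slow_task, [n]) for n in range(10)]
results = [f.get() for f in futures]  # block until every task is done
elapsed = time.monotonic() - start

# All ten sleeps overlap, so elapsed is close to 0.2 s, not 2 s.
print(results, round(elapsed, 2))
```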
