How to speed up API requests?

Backend · Unresolved · 5 answers · 1560 views
没有蜡笔的小新 2020-12-16 02:31

I've constructed the following little program for getting phone numbers using Google's Places API, but it's pretty slow. When I'm testing with 6 items it takes anywhere fr…

5 answers
  • 2020-12-16 02:43

    You may want to send the requests in parallel. Python provides the multiprocessing module, which is well suited to tasks like this.

    Sample code:

    from multiprocessing import Pool
    import requests
    
    def get_data(i):
        r1 = requests.get('https://maps.googleapis.com/maps/api/place/textsearch/json?query='+ i +'&key=MY_KEY')
        a = r1.json()
        pid = a['results'][0]['place_id']
        r2 = requests.get('https://maps.googleapis.com/maps/api/place/details/json?placeid='+pid+'&key=MY_KEY')
        b = r2.json()
        phone = b['result']['formatted_phone_number']
        name = b['result']['name']
        website = b['result']['website']
        return ' '.join((phone, name, website))
    
    if __name__ == '__main__':
        terms = input("input places separated by comma").split(",")
        with Pool(5) as p:
            print(p.map(get_data, terms))
    
  • 2020-12-16 02:46

    It's a matter of latency between the client and the server; you can't change much on that front unless you use multiple server locations (so that the server nearest to the client handles the request).

    In terms of throughput, though, you can build a multithreaded system that handles several requests at once.

  • 2020-12-16 02:49

    Most of the time isn't spent computing your request; it is spent communicating with the server, which is something you cannot control.

    However, you may be able to speed things up with parallelization. As a start, create a separate thread for each request.

    from threading import Thread
    
    def request_search_terms(*args):
        # your logic for a request goes here
        pass
    
    # ...
    
    threads = []
    for st in searchTerms:
        threads.append(Thread(target=request_search_terms, args=(st,)))
        threads[-1].start()
    
    for t in threads:
        t.join()
    

    Then, as the number of requests grows, switch to a thread pool; this avoids the overhead of creating a thread per request (see the sketch below).
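
    For example, a minimal thread-pool sketch using the standard library's concurrent.futures (not part of the original answer; request_search_terms and searchTerms are the same placeholders as in the snippet above):

    from concurrent.futures import ThreadPoolExecutor
    
    def request_search_terms(term):
        # your logic for a single request goes here
        pass
    
    searchTerms = ["term one", "term two"]  # placeholder input
    
    # A pool of 5 workers reuses threads instead of creating one per term
    with ThreadPoolExecutor(max_workers=5) as pool:
        results = list(pool.map(request_search_terms, searchTerms))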

  • 2020-12-16 02:59

    There is no need to do multithreading yourself. grequests provides a quick drop-in replacement for requests.
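
    If you go that route, a rough sketch might look like this (the query list and MY_KEY are placeholders; grequests.map sends the requests concurrently and returns the responses in order, with None for any that failed):

    import grequests  # pip install grequests
    
    queries = ["query one", "query two"]  # placeholder search terms
    urls = ['https://maps.googleapis.com/maps/api/place/textsearch/json?query=' + q + '&key=MY_KEY'
            for q in queries]
    
    # Build unsent requests, then fire them all at once
    reqs = (grequests.get(u) for u in urls)
    responses = grequests.map(reqs)
    
    for r in responses:
        if r is not None:
            print(r.json()['results'][0]['place_id'])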

  • 2020-12-16 03:02

    Use sessions to enable persistent HTTP connections, so you don't have to establish a new connection for every request.

    Docs: Requests Advanced Usage - Session Objects
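
    A short sketch of the two Places calls sharing one session (endpoints copied from the first answer's snippet; MY_KEY stands in for a real API key):

    import requests
    
    session = requests.Session()  # keeps the underlying TCP/TLS connection open between calls
    
    def get_place_details(query):
        r1 = session.get('https://maps.googleapis.com/maps/api/place/textsearch/json',
                         params={'query': query, 'key': 'MY_KEY'})
        pid = r1.json()['results'][0]['place_id']
        r2 = session.get('https://maps.googleapis.com/maps/api/place/details/json',
                         params={'placeid': pid, 'key': 'MY_KEY'})
        return r2.json()['result']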
