Throttle pandas apply, when using an API call

Submitted on 2019-11-29 05:17:33

Here is some tested code that may help:

1. Simple rate limiting to what the API specifies (Nominatim appears to allow one request per second, but I had success with intervals as low as 0.1 seconds).
2. Simple result caching in a dictionary, controllable by a parameter for testing.
3. A retry loop with multiplicative slowdown and linear speedup (it slows down fast and speeds up more slowly).
4. A test exception for faking errors.

I cannot replicate the issues you are experiencing; they are likely due to something on the network path between you and the API.

A more robust strategy may be to build a local persistent cache and keep retrying until the full batch is built. The cache could be a pandas DataFrame written to a CSV file. The overall pseudocode is something like:

repeat until all addresses are in the cache:
    cache = pd.read_csv("cache.csv")
    addresses_to_get = addresses in df that are not in cache
    for batch of n addresses in addresses_to_get:
        cache.add(get_location(addr))
    cache.write_csv("cache.csv")
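A minimal sketch of that persistent-cache loop, under the assumption that `get_location` is a stand-in for the real geocoding call (here it just returns dummy coordinates; swap in your own implementation):

```python
import os
import pandas as pd

CACHE_FILE = "cache.csv"

def get_location(addr):
    # Placeholder for the real API call: returns dummy coordinates
    return {"addr": addr, "lat": 0.0, "lon": 0.0}

def build_cache(addresses, batch_size=2):
    # Load any existing cache, or start empty
    if os.path.exists(CACHE_FILE):
        cache = pd.read_csv(CACHE_FILE)
    else:
        cache = pd.DataFrame(columns=["addr", "lat", "lon"])

    # Keep fetching until every address is cached
    while True:
        cached = set(cache["addr"])
        to_get = [a for a in addresses if a not in cached]
        if not to_get:
            break
        batch = to_get[:batch_size]
        rows = [get_location(a) for a in batch]
        cache = pd.concat([cache, pd.DataFrame(rows)], ignore_index=True)
        # Persist after each batch so a crash loses at most one batch
        cache.to_csv(CACHE_FILE, index=False)
    return cache
```

Because the cache is re-read on startup and written after every batch, the script can be killed and restarted without losing completed work.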

Here is the tested code:

import datetime
import time

import pandas as pd
from geopy.geocoders import Nominatim
geo_locator = Nominatim(user_agent="notarealemail@gmail.com")


# Define the rate limit function and its global state

last_time = datetime.datetime.now()

def rate_limit(min_interval_seconds=.1):
    global last_time
    sleep = min_interval_seconds - (datetime.datetime.now() - last_time).total_seconds()
    if sleep > 0:
        print(f'Sleeping for {sleep} seconds')
        time.sleep(sleep)
    last_time = datetime.datetime.now()

# make a cache dictionary keyed by address 
geo_cache = {}
backoff_seconds = 0

def get_coordinates_with_retry(addr):

    # Shared backoff state persists across calls
    global backoff_seconds


    # set the backoff initial values and factors
    max_backoff_seconds = 60
    backoff_exponential = 2
    backoff_linear = 2

    # rate limited API call
    rate_limit()

    # Retry until max_backoff_seconds is reached

    while backoff_seconds < max_backoff_seconds:   # backoff up to this time
        if backoff_seconds > 0:
            print(f"Backing off for {backoff_seconds} seconds.")
            time.sleep(backoff_seconds)
        try:
            location = geo_locator.geocode(addr)

            # REMOVE THIS: fake an error for testing
            #import random
            #if random.random() < .3:
            #    raise(Exception("Fake exception for testing"))

            # Success - so reduce the backoff linearly
            print(f"Fetched {location} for address {addr}")
            backoff_seconds = backoff_seconds - backoff_linear if backoff_seconds > backoff_linear else 0
            break

        except Exception as e:
            print(f"Exception from geolocator: {e}")
            # Back off exponentially
            backoff_seconds = 1 + backoff_seconds * backoff_exponential

    # >= so that hitting the limit exactly raises instead of returning
    # an unbound location
    if backoff_seconds >= max_backoff_seconds:
        raise Exception("Max backoff reached")

    return location

def get_coordinates(addr, useCache = True):

    # Return from cache if previously loaded
    global geo_cache
    if addr in geo_cache:
        return  geo_cache[addr]

    # Attempt using the full address
    location = get_coordinates_with_retry(addr)

    # Attempt using the first part only if None found
    if location is not None:
        result = pd.Series({'lat': location.latitude, 'lon': location.longitude})
    else:
        print(f"Trying split address for address {addr}")
        location = get_coordinates_with_retry(addr.split(',')[0])
        if location is not None:
            result = pd.Series({'lat': location.latitude, 'lon': location.longitude})
        else:
            result = pd.Series({'lat': -1, 'lon': -1})

    # assign to cache
    if useCache:
        geo_cache[addr] = result
    return result

# Use the test data

df = pd.DataFrame({'addr' : [
'IN,Krishnagiri,635115',  
'IN,Chennai,600005',
'IN,Karnal,132001',
'IN,Jaipur,302021',
'IN,Chennai,600005']})

# repeat the test data to make a larger set

df = pd.concat([df, df, df, df, df, df, df, df, df, df])

df.addr.apply(get_coordinates)
print(f"Address cache contains {len(geo_cache)} address locations.")
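Note that because `get_coordinates` returns a `pd.Series`, `df.addr.apply(get_coordinates)` expands into a DataFrame of `lat`/`lon` columns, which you can assign back onto `df`. A minimal self-contained illustration, using a dummy lookup in place of the real API call:

```python
import pandas as pd

# Dummy lookup standing in for get_coordinates: returning a Series makes
# .apply() expand the result into separate lat/lon columns
def fake_coordinates(addr):
    return pd.Series({'lat': 13.0, 'lon': 80.0})

df2 = pd.DataFrame({'addr': ['IN,Chennai,600005', 'IN,Jaipur,302021']})
df2[['lat', 'lon']] = df2.addr.apply(fake_coordinates)
```

With the real `get_coordinates` the same assignment attaches the fetched coordinates to your frame instead of discarding them.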