
The following is a pattern I have for doing simultaneous requests:

import json

import grequests

rs = (grequests.get(url) for url in urls)
res_items = grequests.map(rs)
for num, res in enumerate(res_items):
    json_data = json.loads(res.text)

However, this crashes with the error ConnectionError: HTTPConnectionPool(host='apicache.vudu.com', port=80): Max retries exceeded with url: about every 5,000 requests. What would be a more reliable pattern for doing the above -- for example, retrying the URL up to five times if an individual request fails?
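The generic shape of "retry up to five times" (independent of grequests; `fetch` below is a placeholder for whatever function actually performs the request) would be something like:

```python
def fetch_with_retries(fetch, url, max_tries=5):
    # Call fetch(url), retrying on any exception, up to max_tries attempts.
    for attempt in range(1, max_tries + 1):
        try:
            return fetch(url)
        except Exception:
            if attempt == max_tries:
                raise  # give up after the final attempt

# Demo with a stand-in fetcher that fails twice before succeeding.
calls = {'count': 0}
def flaky_fetch(url):
    calls['count'] += 1
    if calls['count'] < 3:
        raise ConnectionError('simulated failure')
    return 'payload'

print(fetch_with_retries(flaky_fetch, 'http://example.com'))  # prints: payload
```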

David542

1 Answer


Here is one option, using exponential backoff as described here:

import logging
import random
import time

import grequests

log = logging.getLogger(__name__)

def grequester(url, n=1):
    '''
    Google exponential backoff: https://developers.google.com/drive/web/handle-errors?hl=pt-pt
    Note that grequests.get() only builds the request; grequests.map()
    actually sends it, and by default returns None in place of a failed
    response (e.g. on a ConnectionError).
    '''
    MAX_TRIES = 8
    res = grequests.map([grequests.get(url)])[0]
    if res is None:  # the request failed
        if n > MAX_TRIES:
            return None
        n += 1
        log.warning('Try #%s for %s...' % (n, url))
        time.sleep((2 ** n) + random.randint(0, 1000) / 1000.0)  # add jitter 0-1000ms
        return grequester(url, n)
    return res
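For reference, the sleep schedule this produces is 2**n seconds plus up to one second of jitter; since n is incremented before sleeping and MAX_TRIES is 8, the delays run from roughly 4s up to ~512s. A standalone sketch of the schedule (no grequests needed):

```python
import random

def backoff_delay(n):
    # Same delay formula as grequester(): 2**n seconds plus 0-1000 ms jitter.
    return (2 ** n) + random.randint(0, 1000) / 1000.0

# Delays for tries n=2 through n=9: roughly 4s, 8s, ..., ~512s.
schedule = [backoff_delay(n) for n in range(2, 10)]
```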
Alexander Patrikalakis