
Right now I am using urllib2 to pull some data off a server. However, the server is a bit dodgy and tends to go down every now and then for a minute or so. To deal with this, when my code encounters an error it just waits two seconds and tries again:

import time
import urllib2

def fin(group):
    try:
        data = urllib2.urlopen("cool website" + group)
        return data.read()
    except urllib2.HTTPError, err:
        time.sleep(2)
        return fin(group)  # calls itself again and passes the result back up
    except urllib2.URLError, err:
        time.sleep(2)
        return fin(group)

That works fine if the website goes down or I lose my internet connection. However, last night I left the code running and got this error:

socket.error: [Errno 10053] An established connection was aborted by the software in your host machine 

I am not quite sure how to catch that. After some searching I am thinking I may need to do this:

except httplib.HTTPException, err:
    time.sleep(2)
    fin(group)

But I am not certain. Would anyone be able to help me out?
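To show what I have in mind, here is a rough sketch of the same retry but also catching socket.error (which is where Errno 10053 comes from) alongside httplib.HTTPException. I have not been able to reproduce the outage to test it, so I do not know whether this actually covers the failure I saw:

import httplib
import socket
import time
import urllib2

def fin(group):
    try:
        data = urllib2.urlopen("cool website" + group)
        return data.read()
    except (urllib2.HTTPError, urllib2.URLError,
            httplib.HTTPException, socket.error), err:
        # wait a couple of seconds and retry on any of these network errors
        time.sleep(2)
        return fin(group)

Grouping the exceptions into one tuple just keeps the handlers from repeating the same sleep-and-retry body; I am not sure whether that is the right set of exceptions to catch.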

user2351418
  • What's with the time.sleeps in the exception handlers? – Corey Goldberg Dec 16 '13 at 00:41
  • Try catching the `socket.error` stated here? http://docs.python.org/2/library/socket.html#socket.error Maybe this would help a bit: http://stackoverflow.com/questions/14425401/catch-socket-error-errno-111-connection-refused-exception – yanhan Dec 16 '13 at 01:18

0 Answers