I am getting an error while running a Python script that extracts pixel points from a Google Static Maps image. I got the script from Google maps - how to get building's polygon coordinates from address? and I run it with Python 2.7. Initially the script ran without any errors, but after running continuously for 3-4 hours I get the following error:

Traceback (most recent call last):
File "pyscript.py", line 19, in <module>
imgBuildings = io.imread(urlBuildings)
File "/usr/local/lib/python2.7/dist-packages/skimage/io/_io.py", line 60, in i
with file_or_url_context(fname) as fname:
File "/usr/lib/python2.7/contextlib.py", line 17, in __enter__
return self.gen.next()
File "/usr/local/lib/python2.7/dist-packages/skimage/io/util.py", line 29, in
u = urlopen(resource_name)
File "/usr/lib/python2.7/urllib2.py", line 154, in urlopen
return opener.open(url, data, timeout)
File "/usr/lib/python2.7/urllib2.py", line 435, in open
response = meth(req, response)
File "/usr/lib/python2.7/urllib2.py", line 548, in http_response
'http', request, response, code, msg, hdrs)
File "/usr/lib/python2.7/urllib2.py", line 473, in error
return self._call_chain(*args)
File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
result = func(*args)
File "/usr/lib/python2.7/urllib2.py", line 556, in http_error_default
raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
urllib2.HTTPError: HTTP Error 403: Forbidden

As I am new to Python, I am not sure how to fix this. Is this some kind of cache issue? Help is very much appreciated.

Sunil
  • I've run into this a lot recently when dealing with geolocation data downloaded from an external source. The code is not resilient enough to deal with download failures. You must put the code inside a try/except, since there is a chance that at some point in the day you will hit a network drop and lose some packets. That is most likely all that is happening: when that happens, urllib2 raises an exception, which is the error you are seeing. – Eamonn Kenny Sep 15 '17 at 11:58

1 Answer

I've seen this problem quite a lot, and it's due to intermittent network drop errors. There is a recursive trick with try/except exception handling that will keep this from breaking your run, even if your network goes down for hours.

To explain: you attempt a download. If it fails, the function recursively retries 1/4, 1/2, 1, 2, 4, 8, ... seconds later, waiting at most 1 hour between attempts. If you are working in a company, for instance, the network might go down over the weekend, but your code will just poll once an hour (at most) and then recover when the network is fixed.

import time

from skimage import io


def recursiveBuildingGetter( urlBuildings, waitTime=0.25 ):

  try:
    imgBuildings = io.imread(urlBuildings)
  except:
    print "Warning: failure after %f secs for %s" % ( waitTime, str(urlBuildings) )

    # Wait, then retry with double the delay, capped at 1 hour.
    time.sleep(waitTime)
    waitTime = min( waitTime * 2.0, 3600.0 )
    imgBuildings = recursiveBuildingGetter( urlBuildings, waitTime )

  return imgBuildings
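
If the outage lasts a long time, the recursion above keeps nesting one stack frame per retry, so for very long waits you may prefer an iterative version of the same exponential backoff. Here is a minimal sketch; `fetchWithRetry` and the injected `fetch` callable are illustrative names of my own, not part of the original code:

```python
import time


def fetchWithRetry(fetch, url, firstWait=0.25, maxWait=3600.0):
    # Iterative equivalent of the recursive retry: keep doubling the
    # delay between failed attempts, capped at maxWait seconds.
    waitTime = firstWait
    while True:
        try:
            return fetch(url)
        except Exception:
            time.sleep(waitTime)
            waitTime = min(waitTime * 2.0, maxWait)


# Usage with scikit-image would be:
#   imgBuildings = fetchWithRetry(io.imread, urlBuildings)
```

Because the loop holds `waitTime` in a local variable instead of passing it down a recursive call, it never grows the call stack no matter how long the network stays down.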
Eamonn Kenny