Summary:
```r
## parameters_string is a vector, 3K long, of different parameter values for the API
U <- paste0("www.Somehost.com/api?", parameters_string)
getURL(U)
```

This eventually fails after around 350 calls, but works again after restarting R.

```r
getURL(U, async = FALSE)
```

This works without error, though it is of course very slow.
Details:
I am hitting an API, iterating over several thousand URLs. The API exposes a counter indicating how many calls can still be made, and I am beneath the limit.
The problem is that after some time, `getURL()` fails, throwing a "Could not resolve host: Somehost.com" error.
I can access the URL in any browser. Restarting R also resolves the problem, as does setting the `async = FALSE` flag in `getURL()`.
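For reference, the error is easy to catch but, as far as I can tell, not to recover from within the session; a minimal sketch of how it surfaces when wrapped in `tryCatch()` (simplified from my actual loop):

```r
library(RCurl)

## Wrap the call so the resolver error is caught instead of aborting the run.
res <- tryCatch(
  getURL(U),
  error = function(e) {
    ## conditionMessage(e) contains "Could not resolve host: Somehost.com"
    message("getURL failed: ", conditionMessage(e))
    NULL
  }
)
```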
I've tried `closeAllConnections()`, but this did not help.
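What I have in mind is something like the sketch below: split the URLs into chunks, use a fresh multi handle per chunk, and force garbage collection between chunks so that handle finalizers can run. This is only a guess at the right cleanup; `chunk_size` is arbitrary, and I have not verified that `gc()` actually releases the underlying sockets.

```r
library(RCurl)

chunk_size <- 100
chunks <- split(U, ceiling(seq_along(U) / chunk_size))

results <- lapply(chunks, function(urls) {
  mh  <- getCurlMultiHandle()                       # fresh multi handle per chunk
  out <- getURIAsynchronous(urls, multiHandle = mh)
  rm(mh)                                            # drop the only reference...
  gc()                                              # ...so its finalizer can free the sockets
  out
})
```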
What is the best way to intermittently clean up whatever `getURL()` is leaving open, so that it can be used with `async = TRUE`?