I just came across this powerful R package but unfortunately haven't been able to find out how to request a list of URLs in parallel and parse the JSON responses.
As a simple example, suppose I have a list of cities (in Switzerland):
list_cities <- c("Winterthur", "Bern", "Basel", "Lausanne", "Lugano")
Next, I'd like to find public transport connections from each of the listed cities to Zurich. I can use the following transport API to query public timetable data:
https://transport.opendata.ch
Using the httr package, I can make a request for each city as follows:
library(httr)

durations <- list()
for (city in list_cities) {
  # request a single connection to Zurich, returning only its duration
  r <- GET(paste0("http://transport.opendata.ch/v1/connections?from=", city,
                  "&to=Zurich&limit=1&fields[]=connections/duration"))
  # store the parsed result per city instead of overwriting it each iteration
  durations[[city]] <- content(r, as = "parsed", type = "application/json", encoding = "UTF-8")
}
to get the duration of the individual journeys. However, I have a much longer list of cities and several destinations, which is why I am looking for a way to make multiple requests in parallel.
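To illustrate what I have in mind, here is an untested sketch of a parallel version using `parallel::mclapply` from base R (the helper `get_duration` is my own name, and `mc.cores = 4` is an arbitrary choice) — but I don't know whether this is the idiomatic way to do it with this package:

```r
library(httr)
library(parallel)

list_cities <- c("Winterthur", "Bern", "Basel", "Lausanne", "Lugano")

# hypothetical helper: fetch one connection and extract its duration
get_duration <- function(city) {
  r <- GET(paste0("http://transport.opendata.ch/v1/connections?from=", city,
                  "&to=Zurich&limit=1&fields[]=connections/duration"))
  cont <- content(r, as = "parsed", type = "application/json", encoding = "UTF-8")
  cont$connections[[1]]$duration
}

# fork-based parallelism; mc.cores > 1 is not supported on Windows,
# where parLapply with a cluster would be needed instead
durations <- mclapply(list_cities, get_duration, mc.cores = 4)
```

This parallelizes across processes rather than using non-blocking HTTP, so I suspect an async approach at the connection level would scale better for a long list of URLs.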