I am trying to fetch a large amount of data from FRED (Federal Reserve Economic Data) using Java, but I am running into problems. Below is a sample of my code.
for (US_STATE state : US_STATE.values()) { // loop over every US state
    String searchText = URLEncoder.encode("Unemployment Rate in " + state, "UTF-8"); // encode the space characters
    String fredUrl = "https://api.stlouisfed.org/fred/series/search?search_text=" + searchText + "&api_key=**********&file_type=json";
    ObjectMapper mapper = new ObjectMapper();
    JsonNode rootNode = mapper.readTree(new URL(fredUrl)); // first request here
    Thread.sleep(500); // sleep to avoid the HTTP errors
    ArrayNode nodeSeriess = (ArrayNode) rootNode.get("seriess");
    // Stream over the JSON nodes, extract each series id from rootNode, and build the next URL
    Stream<JsonNode> elementStream = StreamSupport.stream(nodeSeriess.spliterator(), false);
    elementStream.forEach(node -> {
        try {
            String observUrl = "https://api.stlouisfed.org/fred/series/observations?series_id=" + node.get("id").asText() + "&api_key=***********&file_type=json";
            // second request, inside the stream
            JsonNode nodeValue = mapper.readTree(new URL(observUrl));
            Thread.sleep(500); // sleep to avoid the HTTP errors
            ArrayNode nodeValueObserv = (ArrayNode) nodeValue.get("observations");
            ...... // data processing starts here
        } catch (Exception e) {
            throw new RuntimeException(e); // readTree/sleep throw checked exceptions inside the lambda
        }
    });
}
As you can see, my code issues a large number of requests through the Java URL API, and I believe that is why I get "HTTP response code: 504" for the URL. From googling I learned that a 504 (Gateway Timeout) is a server-side error rather than a problem in the client code, yet I receive it consistently whenever I make many requests in a row. How can I avoid this error programmatically? I am afraid my program calls the URL API too many times inside the loop.
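To make the question concrete, this is the kind of retry-with-backoff wrapper I am considering around each request (the helper name and the simulated flaky call are my own illustration, not FRED API code; in my real program the `Callable` would be the `mapper.readTree(new URL(...))` call):

```java
import java.util.concurrent.Callable;

public class RetryDemo {
    // Retry a request up to maxAttempts times, doubling the wait between attempts.
    static <T> T withRetry(Callable<T> request, int maxAttempts, long initialDelayMs) throws Exception {
        long delay = initialDelayMs;
        for (int attempt = 1; ; attempt++) {
            try {
                return request.call();
            } catch (Exception e) {
                if (attempt >= maxAttempts) throw e; // give up after the last attempt
                Thread.sleep(delay);                 // back off before retrying
                delay *= 2;                          // exponential backoff
            }
        }
    }

    public static void main(String[] args) throws Exception {
        // Simulated flaky request: fails twice (as if the server returned 504), then succeeds.
        int[] calls = {0};
        String result = withRetry(() -> {
            if (++calls[0] < 3) throw new java.io.IOException("HTTP 504");
            return "ok";
        }, 5, 10);
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```

I am not sure whether retrying like this is enough, or whether I also need to slow down the overall request rate.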