I'm writing a simple API that reads a file from a remote server and returns it to the caller, using Javalin.
This is my code:
class FileReader {
    public byte[] getFileBytes(String fileURL) {
        ...
        HttpURLConnection con = null;
        try {
            URL url = new URL(fileURL);
            con = (HttpURLConnection) url.openConnection();
            con.setRequestMethod("GET");
            ...
            int status = con.getResponseCode();
            if (status != 200) {
                // manage errors
            } else {
                return con.getInputStream().readAllBytes();
            }
        } catch (Exception e) {
            // manage exceptions
        } finally {
            if (con != null) {
                con.disconnect();
            }
        }
    }
}
public void handle(Context ctx) throws Exception {
    ...
    byte[] fileBytes = fileReader.getFileBytes("https://..../myfile");
    ctx.header(Header.CONTENT_TYPE, "application/octet-stream");
    ctx.result(fileBytes);
}
Everything seems to be OK, but the problem is that every file is fully downloaded into memory before being returned to the original caller. I was wondering if there is a cleverer way to do this, such as reading the bytes into a buffer (a chunk at a time) from the InputStream and writing them to the Javalin response, so that memory is only used for the chunk length... I couldn't find a way to do this.
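Roughly, this is the kind of thing I had in mind: an untested sketch that copies the remote stream to the servlet response one chunk at a time, assuming a Javalin version where ctx.res() exposes the underlying HttpServletResponse (in older versions it is the ctx.res field), and with error handling / status checks omitted for brevity:

public void handle(Context ctx) throws Exception {
    HttpURLConnection con = (HttpURLConnection) new URL("https://..../myfile").openConnection();
    con.setRequestMethod("GET");
    try (InputStream in = con.getInputStream()) {
        ctx.header(Header.CONTENT_TYPE, "application/octet-stream");
        OutputStream out = ctx.res().getOutputStream(); // write directly to the servlet response
        byte[] buffer = new byte[8192];                 // only this buffer is held in memory
        int read;
        while ((read = in.read(buffer)) != -1) {        // read one chunk at a time
            out.write(buffer, 0, read);                 // and write it straight to the response
        }
        out.flush();
    } finally {
        con.disconnect();
    }
}

But I'm not sure this is the idiomatic way to do it with Javalin, or whether writing to the raw response like this bypasses something Javalin needs to manage itself.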
Also, I've seen that Javalin has a ctx.result overload that takes an InputStream, and I was hoping to try it and see if it used less memory, but doing something like the following results in an IOException: "stream is closed".
public void handle(Context ctx) throws Exception {
    HttpURLConnection con = null;
    try {
        URL url = new URL(fileURL);
        con = (HttpURLConnection) url.openConnection();
        con.setRequestMethod("GET");
        ...
        int status = con.getResponseCode();
        if (status != 200) {
            // manage errors
        } else {
            ctx.header(Header.CONTENT_TYPE, "application/octet-stream");
            ctx.result(con.getInputStream());
        }
    } catch (Exception e) {
        // manage exceptions
    } finally {
        if (con != null) {
            con.disconnect();
        }
    }
}
If you have any hints or ideas, I would be grateful. Thanks!