I am trying to convert a downloaded file into a byte[] and return it. The maximum size I can convert without it failing is around 70 MB, which is roughly the amount of free memory on my Ubuntu instance. It would be unrealistic to need the file's size worth of RAM just to download it.
I have tried reading through a BufferedInputStream into a ByteArrayOutputStream, but it runs out of memory the moment it begins writing. In the code below it makes it to "Starting buffered write" before it stops.
// imports: java.io.*
byte[] bytes = null;
byte[] buffer = new byte[1024];
int count;
ByteArrayOutputStream bos = new ByteArrayOutputStream();
// try-with-resources closes both streams in the right order, even on failure
try (FileInputStream fis = new FileInputStream(file);
     BufferedInputStream bis = new BufferedInputStream(fis))
{
    System.out.println("Starting buffered write");
    // Accumulate the entire file in memory, 1 KB at a time
    while ((count = bis.read(buffer)) != -1)
    {
        bos.write(buffer, 0, count);
    }
    System.out.println("Finished buffered write");
    //System.out.println("Trying copyLarge");
    //IOUtils.copyLarge(fis, bos);
    //System.out.println("Successful copyLarge");
    System.out.println("Starting stream to bytes");
    // toByteArray() copies the internal buffer, so this step needs a second
    // file-sized allocation on top of what bos already holds
    bytes = bos.toByteArray();
    System.out.println("Finished stream to bytes");
}
catch (IOException e)
{
    e.printStackTrace();
}
Something that confuses me is that this approach works for the upload function with large files, no problem. The upload creates a temp file, opens an output stream to it, and then writes the uploaded file's input stream into it, roughly as sketched below. Is it possible that, since the upload temp files are not being deleted after use, they are using up my instance's memory?
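For reference, this is a minimal sketch of the upload path as I described it above; the variable names (uploadInputStream, tempFile) and the File.createTempFile call are my reconstruction, not the exact code:

// imports: java.io.*
// Stream the upload straight to a temp file on disk, so peak memory
// stays at one buffer regardless of the file size.
File tempFile = File.createTempFile("upload", ".tmp");
try (InputStream in = uploadInputStream;   // the uploaded file's input stream
     OutputStream out = new FileOutputStream(tempFile))
{
    byte[] buf = new byte[1024];
    int n;
    while ((n = in.read(buf)) != -1)
    {
        out.write(buf, 0, n);
    }
}
// tempFile is never deleted afterwards, matching the behavior described.

The difference from the download code is that here the data lands on disk rather than in a ByteArrayOutputStream.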