
I am getting a zipped blob from the database and using that blob in the following way,

Ex:-

byte[] inputBlob = blobfile.getBytes(1, (int) blobfile.length());
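
(For context, a minimal sketch of how such a blob is typically fetched, assuming blobfile is a java.sql.Blob read from a JDBC ResultSet; the query, table, column and id below are placeholders, not from the original code:)

// Hypothetical retrieval code; the SQL and column index are illustrative only.
try (PreparedStatement ps = connection.prepareStatement("SELECT data FROM files WHERE id = ?")) {
    ps.setLong(1, id);
    try (ResultSet rs = ps.executeQuery()) {
        if (rs.next()) {
            Blob blobfile = rs.getBlob(1);
            // getBytes(1, length) copies the whole blob into memory at once
            byte[] inputBlob = blobfile.getBytes(1, (int) blobfile.length());
        }
    }
}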

After getting the blob, this is how I built the zippedStream and passed it into another class's method (unzipper).

Ex:-

ByteArrayOutputStream zippedStream = null;
InputStream byteInputStream = null;
IParser parser = null;
byte[] buffer = null;
try {
    zippedStream = new ByteArrayOutputStream();
    // 'blob' is the byte[] read from the database (inputBlob above)
    byteInputStream = new ByteArrayInputStream(blob);
    blob = null;
    int bytes_read;
    // available() on a ByteArrayInputStream reports the full remaining size,
    // so this allocates a second buffer as large as the whole blob
    buffer = new byte[byteInputStream.available()];

    while ((bytes_read = byteInputStream.read(buffer)) > 0) {
        zippedStream.write(buffer, 0, bytes_read);
    }
    buffer = null;
    byteInputStream.close();
    byteInputStream = null;

} catch (Exception e) {
    e.printStackTrace();
}
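
(Side note: since blob is already a byte[], the loop above only copies it into another in-memory buffer. Assuming nothing else reads from byteInputStream, the same result could be had with a single write, e.g.:)

// equivalent to the read/write loop above: one copy of the byte[] into the stream
zippedStream.write(blob, 0, blob.length);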

The unzipper method, Ex:-

byte[] buffer = new byte[1024];
try {
    InputStream decodedInput = new ByteArrayInputStream(zippedStream.toByteArray());
    zippedStream.close();
    zippedStream = null;
    GZIPInputStream unzippedStream = new GZIPInputStream(decodedInput);
    // closing a ByteArrayInputStream is a no-op, so the GZIP stream can still read from it
    decodedInput.close();
    decodedInput = null;
    int bytes_read;
    // unzippedOutputstream and logger are fields of the enclosing class
    unzippedOutputstream = new ByteArrayOutputStream();
    while ((bytes_read = unzippedStream.read(buffer)) > 0) {
        unzippedOutputstream.write(buffer, 0, bytes_read);
    }
    buffer = null;
    unzippedStream.close();
    unzippedStream = null;
} catch (Exception ex) {
    logger.setException(ex);
    logger.error("unzipper", generateMsg("Exception occurred"));
}
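
(For reference, a sketch of the same decompression collapsed into one helper, using try-with-resources so the streams are closed even on errors; the unzip name and the 8 KB buffer size are just placeholders:)

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;

static byte[] unzip(byte[] gzippedBytes) throws IOException {
    // decompress straight from the original byte[], with no intermediate ByteArrayOutputStream copy
    try (GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(gzippedBytes));
         ByteArrayOutputStream out = new ByteArrayOutputStream()) {
        byte[] buffer = new byte[8192];
        int bytesRead;
        while ((bytesRead = in.read(buffer)) != -1) {
            out.write(buffer, 0, bytesRead);
        }
        return out.toByteArray();
    }
}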

Using this approach my application sometimes got stuck, and performance was very poor. Is there a more optimized way to get the zipped stream and unzip it?

Boo Balan

1 Answer


Is all this buffering really needed? Can your IParser parse a Stream?

 InputStream zippedStream = ...
 IParser parser = ...
 parser.parse(new GZIPInputStream(zippedStream));

This will read the compressed data, uncompressing it as it goes, which is much more efficient.
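
(A slightly fuller sketch of that idea, assuming the blob is still available as a java.sql.Blob so it can be read with getBinaryStream(), and assuming IParser has a parse(InputStream) method as shown above; both are assumptions about your API:)

// stream the compressed blob straight into the parser; nothing is buffered in full
static void parseCompressedBlob(Blob blobfile, IParser parser) throws SQLException, IOException {
    try (InputStream zipped = blobfile.getBinaryStream();
         GZIPInputStream unzipped = new GZIPInputStream(new BufferedInputStream(zipped))) {
        parser.parse(unzipped); // decompresses lazily as the parser reads
    }
}

That way the data is only held once, as the compressed bytes coming from the database, instead of being copied into several full-size byte arrays along the way.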

Peter Lawrey