
I have read through several websites showing sample code to convert a java.sql.Blob object to a byte[] and finally save it as a file.

e.g. easiest-way-to-convert-a-blob-into-a-byte-array or image-to-bytearray-to-blob-and-blob-to-bytearray-to-image-conversion-issues-in-j

Most of the suggestions use blob.getBytes(pos, length), i.e. byte[] blobBytes = blob.getBytes(1, (int) blob.length());

However, if the size of the blob exceeds the maximum value of an int (e.g. 3 GB), the blobBytes array will be truncated and the created file will be malformed.
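
To illustrate, the downcast silently overflows once the length passes Integer.MAX_VALUE:

long blobLength = 3L * 1024 * 1024 * 1024; // 3 GB, larger than Integer.MAX_VALUE
int castLength = (int) blobLength;         // overflows to -1073741824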

Is there any way to overcome that size limitation?

Or do I have to use blob.getBinaryStream(1, blob.length()) to get an InputStream and process it further into a byte[]?

Vince
  • Why would you even want to have 3 GB in memory? So - yes - you should stream the data and work only on portions of the data. And - yes - this is what the `getBinaryStream` method is for. – Seelenvirtuose Dec 08 '15 at 08:53
  • Which database do you use? – Arnaud Dec 08 '15 at 08:59
  • @Berger, I'm using Oracle DB. I checked that the supported blob size goes up to 4 GB - 1. – Vince Dec 08 '15 at 09:00
  • @Seelenvirtuose, my question arises from a code-scan result which highlighted that I downcast a `long` to an `int` in `blob.getBytes(1, (int) blob.length());`. That is a method in my application's API which is supposed to read a blob from the DB and output it as a byte[]. – Vince Dec 08 '15 at 09:12

2 Answers


The easiest way is to use Apache Commons IO's IOUtils (which you need to include in your project) to copy the stream to disk:

// Stream the blob straight to disk so it is never held fully in memory
try (InputStream blobInputStream = blob.getBinaryStream();
     OutputStream fileOutputStream = new FileOutputStream("/path/to/file.out")) {
    IOUtils.copyLarge(blobInputStream, fileOutputStream);
}

From the Commons IO Javadoc: copyLarge(InputStream input, OutputStream output): Copies bytes from a large (over 2 GB) InputStream to an OutputStream.

Edit: You could use blob.getBytes as well, but you would need to pull out a reasonable number of bytes (e.g. 8192) at a time in a loop and write each chunk to a FileOutputStream, as sketched below.
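
A minimal sketch of that loop (the class and method names, chunk size, and error handling are placeholders; note that JDBC blob positions start at 1):

import java.io.FileOutputStream;
import java.io.OutputStream;
import java.sql.Blob;

class BlobToFile {
    // Copies a blob to disk in fixed-size chunks, never holding more than one chunk in memory
    static void copyBlobToFile(Blob blob, String path) throws Exception {
        final int CHUNK = 8192;
        try (OutputStream out = new FileOutputStream(path)) {
            long length = blob.length();
            long pos = 1; // JDBC blob positions are 1-based
            while (pos <= length) {
                int toRead = (int) Math.min(CHUNK, length - pos + 1);
                out.write(blob.getBytes(pos, toRead)); // the downcast is safe: toRead <= CHUNK
                pos += toRead;
            }
        }
    }
}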

Erik Živković
  • I'll accept this answer as it has directed me to the answer I want. The maximum size a byte[] can hold is limited by int, and the code throws an exception once that upper limit is exceeded, so converting a 3 GB blob into a byte[] is impossible. The only way to fully process the blob is via an InputStream. Thanks – Vince Dec 08 '15 at 09:25

Oracle provides a method to get an InputStream from an oracle.sql.BLOB:

See:

Reading and Writing BLOB and CLOB Data
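
For illustration, a minimal sketch of that streaming approach with plain JDBC (the table files, column data, and method name are hypothetical, and InputStream.transferTo requires Java 9+):

import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Blob;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

class BlobDownload {
    // Streams a BLOB column straight to a file; the blob is never fully materialized in memory
    static void downloadBlob(Connection conn, long id, String path) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement("SELECT data FROM files WHERE id = ?")) {
            ps.setLong(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                if (rs.next()) {
                    Blob blob = rs.getBlob(1);
                    try (InputStream in = blob.getBinaryStream();
                         OutputStream out = new FileOutputStream(path)) {
                        in.transferTo(out); // copies in internal chunks (Java 9+)
                    }
                }
            }
        }
    }
}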

Arnaud
  • I suppose the method you suggested is Blob.getBinaryStream(long pos, long length)? Actually I just wanted to find out whether an InputStream is the only way to handle this kind of big Blob object, so that I can decide how to modify the API. Thanks – Vince Dec 08 '15 at 09:17