
A better version of my last question:

I have BLOB data, compressed with an LZW algorithm, stored in an Oracle database. I have to extract it using Python and load it into an MS SQL data warehouse. Decompression works fine for smaller documents but fails for larger blobs. I was thinking of splitting the blob, decompressing the pieces, and then combining them again. Is that a feasible option, and if so, how can it be accomplished? (I am completely new to Python.) Any other suggestions are most welcome.

Smaller compressed blob (7,413 bytes) – successfully decompressed into plain text

Larger compressed blob (20,801 bytes) – only part of it gets decompressed
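For context on why splitting is unlikely to help: LZW builds its dictionary incrementally while reading the stream, so a chunk taken from the middle of a compressed blob cannot be decompressed on its own. The sketch below is a minimal textbook LZW codec in Python, shown only to illustrate that statefulness; it is an assumption that the blobs use anything like this variant, since real-world LZW implementations differ in code widths, reset codes, and dictionary limits, and the decompressor must match the exact compressor that produced the data.

```python
# Minimal textbook LZW codec (256-entry byte seed, unbounded dictionary).
# Sketch only: the actual blobs may use a different LZW variant.

def lzw_compress(data: bytes) -> list[int]:
    """Compress bytes into a list of integer codes."""
    dictionary = {bytes([i]): i for i in range(256)}
    w = b""
    out = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc
        else:
            out.append(dictionary[w])
            dictionary[wc] = len(dictionary)  # dictionary grows as we go
            w = bytes([byte])
    if w:
        out.append(dictionary[w])
    return out

def lzw_decompress(codes: list[int]) -> bytes:
    """Rebuild the byte stream. The dictionary is reconstructed on the
    fly from the start of the stream, which is why a blob cannot be
    split into chunks and decompressed piecewise."""
    dictionary = {i: bytes([i]) for i in range(256)}
    it = iter(codes)
    w = dictionary[next(it)]
    out = bytearray(w)
    for code in it:
        if code in dictionary:
            entry = dictionary[code]
        elif code == len(dictionary):  # the KwKwK special case
            entry = w + w[:1]
        else:
            raise ValueError(f"bad LZW code: {code}")
        out.extend(entry)
        dictionary[len(dictionary)] = w + entry[:1]
        w = entry
    return bytes(out)
```

If only part of a large blob decompresses, a more likely cause than size itself is a variant mismatch (e.g. the compressor switches to wider codes or emits a dictionary-reset code once the dictionary fills, and the decompressor does not handle it), or the BLOB not being read in full from Oracle before decompression.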

    That probably won't work. And you should show us some code, preferably a [mcve]. What program was used to create the LZW blobs? And what are you using in Python to decompress them? There are various ways to do LZW compression, and the decompressor has to match the compressor. – PM 2Ring Jul 19 '18 at 23:52
  • Thanks for your reply. I don't have access to the program that was used to create LZW blobs. I'll post the code snippet – Doodle Jul 20 '18 at 00:00

0 Answers