A better version of my last question:
I have blob data stored using the LZW algorithm in an Oracle database, which I have to extract using Python and store in an MS SQL data warehouse. Decompression works fine for smaller documents but fails for larger blobs. I was thinking of splitting the blob, decompressing each part, and then combining the results. Is this a feasible option, and if so, how can it be accomplished? (I am completely new to Python.) Any other suggestions are most welcome.
Smaller compressed blob (7,413 bytes): successfully decompressed into plain text.
Larger compressed blob (20,801 bytes): only part of it gets decompressed.
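One caveat worth noting: dictionary-based codecs like LZW build their dictionary cumulatively across the whole stream, so you cannot decompress independent slices of the blob and concatenate the outputs. What does work is feeding the chunks, in order, to a single streaming decompressor object that keeps its state between calls. LZW itself is not in the Python standard library, so the sketch below uses `zlib` purely as a stand-in to illustrate the streaming pattern; the blob bytes and chunk size are placeholders:

```python
import zlib

def decompress_in_chunks(blob_bytes, chunk_size=8192):
    """Feed the compressed blob to ONE streaming decompressor,
    one chunk at a time, so dictionary state carries across chunks."""
    decompressor = zlib.decompressobj()
    parts = []
    for i in range(0, len(blob_bytes), chunk_size):
        parts.append(decompressor.decompress(blob_bytes[i:i + chunk_size]))
    parts.append(decompressor.flush())  # drain any buffered output
    return b"".join(parts)

# Round-trip check with sample data larger than one chunk.
original = b"some repetitive document text " * 5000
compressed = zlib.compress(original)
assert decompress_in_chunks(compressed) == original
```

If the larger blobs are truncating before decompression even starts, it may also be worth checking that the full LOB is being read out of Oracle in the first place (drivers often return LOB locators that must be read explicitly), rather than a problem in the decompression step itself.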