
I have been trying to export BLOB data from Oracle into a text file using Python, and I couldn't find an answer in any of the other questions I looked at.

Below is my code :

import codecs

sql_string = """select
    event_id
    ,blob_length
    ,blob_field
from table"""

cur.execute(sql_string)
path = "P:/Folders/"

for row in cur:
    filename = path + "notes_" + str(row[0]) + "_" + str(row[1]) + ".txt"
    f = codecs.open(filename, encoding='utf-8', mode='wb+')
    f.write(row[2])
    f.close()

I get the below error

TypeError: utf_8_encode() argument 1 must be str, not cx_Oracle.LOB

I have tried a few other approaches, but every example I've found handles strings rather than BLOBs.

Doodle

3 Answers


You have to use the cx_Oracle.LOB.read() method to get the content of the LOB object:

f.write(row[2].read())
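As a side note, `read()` on a BLOB returns `bytes`, so the file must be opened in plain binary mode rather than through `codecs.open` with a text encoding (which is what triggers the `utf_8_encode` error). A minimal sketch of the write pattern, with a literal bytes payload and a made-up filename standing in for the cx_Oracle LOB and the real path:

```python
# Stand-in for the bytes returned by row[2].read() on a BLOB column
blob_bytes = b"\x00\x01binary payload\xff"

# Binary mode: no encoding is applied, the bytes land in the file as-is
with open("notes_1_18.txt", "wb") as f:
    f.write(blob_bytes)

# Reading it back in binary mode returns the identical bytes
with open("notes_1_18.txt", "rb") as f:
    assert f.read() == blob_bytes
```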
blhsing

Implemented what @blhsing suggested and it worked perfectly:

    for row in cur:
        filename = path + "notes_" + str(row[0]) + "_" + str(row[1]) + ".txt"
        data = row[2].read()  # read the full BLOB contents as bytes, once
        f = open(filename, "wb")
        f.write(data)
        f.close()
Doodle

If your LOBs are small enough to fit in memory you'll get better performance if you fetch BLOBs as LONG_BINARY (and use LONG_STRING for CLOBs):

import cx_Oracle

def OutputTypeHandler(cursor, name, defaultType, size, precision, scale):
    if defaultType == cx_Oracle.CLOB:
        return cursor.var(cx_Oracle.LONG_STRING, arraysize=cursor.arraysize)
    if defaultType == cx_Oracle.BLOB:
        return cursor.var(cx_Oracle.LONG_BINARY, arraysize=cursor.arraysize)

# register the handler so fetches on this connection use it
connection.outputtypehandler = OutputTypeHandler

See the cx_Oracle example at https://github.com/oracle/python-cx_Oracle/blob/master/samples/ReturnLobsAsStrings.py

Christopher Jones
  • Thanks Christopher. I was just reading the comments and it says that this method requires contiguous memory, so it is not usable for very large LOBs. I am already facing a problem with large BLOBs, and I was trying to write them to text files so I could decompress the complete BLOB, which I currently can't do with dataframes or a text file. https://stackoverflow.com/questions/51922686/unable-to-to-decompress-large-blobs-in-python?noredirect=1#comment90795261_51922686 Any help/suggestions would be great! – Doodle Aug 20 '18 at 03:30
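For LOBs too large to hold in memory in one piece, the usual alternative is to copy them in fixed-size chunks. The loop below sketches that pattern; `io.BytesIO` stands in for the LOB object, since a real cx_Oracle LOB is read with `lob.read(offset, amount)` using a 1-based offset rather than a sequential `read(n)`:

```python
import io

CHUNK = 64 * 1024  # 64 KiB per read keeps memory bounded

# io.BytesIO stands in for the cx_Oracle LOB; with a real LOB you would
# call lob.read(offset, CHUNK) and advance offset by len(chunk) each pass
source = io.BytesIO(b"x" * 200_000)

out = io.BytesIO()  # stands in for a file opened with open(filename, "wb")
while True:
    chunk = source.read(CHUNK)
    if not chunk:  # empty result means the whole LOB has been copied
        break
    out.write(chunk)

assert out.getvalue() == b"x" * 200_000
```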