
I am trying to read data from one table and write it into a table in another database using Python and cx_Oracle.

My script works fine when I read the records one by one, but when I fetch more than 100 records using fetchmany, the script fails with the error:

cx_Oracle.NotSupportedError: Python value cannot be converted to a database value

This is the script I am trying to run:

import cx_Oracle

src_conn = cx_Oracle.connect(username + '/' + password + '@' + database)
src_cursor = src_conn.cursor()
tgt_conn = cx_Oracle.connect(tgt_username + '/' + tgt_password + '@' + tgt_database)
tgt_cursor = tgt_conn.cursor()

for files in file_path:
    cols = []
    col_id = []
    file = files.replace("\n", "")

    # Look up the column names and positions of the current table.
    column_list_query = ("select COLUMN_NAME, COLUMN_ID from all_tab_columns "
                         "where owner = 'GDW' and table_name = :tbl order by COLUMN_ID")
    col = src_cursor.execute(column_list_query, tbl=file)
    col_list = col.fetchall()

    for value in col_list:
        cols.append(value[0])
        col_id.append(":" + str(value[1]))

    col_names = "(" + ','.join(cols) + ")"
    col_ids = "(" + ','.join(col_id) + ")"

    insert_statement = 'INSERT INTO ' + file + ' ' + col_names + ' VALUES ' + col_ids
    select_statement = "SELECT * FROM " + file
    src_cursor.arraysize = 1000  # fetch source rows in larger round trips
    src_values = src_cursor.execute(select_statement)
    print("Copy contents into table: " + file)

    # Yield the source rows in batches of up to `arraysize` rows.
    def ResultIter(cursor, arraysize=500):
        while True:
            results = cursor.fetchmany(arraysize)
            if not results:
                break
            yield results

    tgt_cursor.prepare(insert_statement)
    for result in ResultIter(src_values):
        tgt_cursor.executemany(None, result)
        tgt_conn.commit()
Shubham
  • If you're reading and writing to the same database, you almost certainly want to do this in SQL or PL/SQL so that the data doesn't have to come across the network and back into the database. – Christopher Jones Jun 07 '18 at 00:11
  • I am copying from one database to another. When I used SQL*Plus or Toad, I had issues with certain datasets and found that I can overcome them using this script. – Veeraputhiran Jun 07 '18 at 07:54
  • Can you post what the result looks like when this failure occurs? I've tried to replicate your issue but haven't been able to do so thus far. – Anthony Tuininga Jun 07 '18 at 15:03

1 Answer


The problem has been reported here; take a look at the comments there.

The error occurs because in the first batch one or more of the bound columns is always None, so cx_Oracle infers its type from NULLs alone, and in a subsequent batch one of those columns then contains an actual value. This has been corrected in the cx_Oracle source, so you can build from source yourself or wait for a patch release.

In the meantime, the current workarounds are as follows (a sketch of (2) and (3) follows this list):

(1) perform all inserts in a single batch (depending on the data size this may not be feasible)

(2) create a new cursor for each batch (so that cx_Oracle recalculates the bind types for each batch)

(3) use cursor.setinputsizes() to specify the bind types and sizes up front (may be cumbersome)
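
As a rough sketch, here is how workarounds (2) and (3) might look in the copy loop from the question. The names tgt_conn, tgt_cursor, insert_statement, src_values, and ResultIter are taken from the question's script; the types passed to setinputsizes are placeholders that would need to match the real table definition. The two loops are alternatives, not meant to run one after the other.

import cx_Oracle

# Workaround (2): use a fresh cursor per batch so that cx_Oracle
# re-infers the bind types from each batch's data.
for batch in ResultIter(src_values):
    batch_cursor = tgt_conn.cursor()
    batch_cursor.executemany(insert_statement, batch)
    batch_cursor.close()
tgt_conn.commit()

# Workaround (3): declare the bind types once, up front, so that a
# batch of all NULLs cannot lock a column into the wrong type.
# The types/sizes here are illustrative; in the question's script they
# could be derived from all_tab_columns instead.
tgt_cursor.prepare(insert_statement)
tgt_cursor.setinputsizes(cx_Oracle.NUMBER, 100)  # e.g. a NUMBER and a VARCHAR2(100)
for batch in ResultIter(src_values):
    tgt_cursor.executemany(None, batch)
tgt_conn.commit()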

A simple test case that demonstrates the problem is as follows:

import cx_Oracle

conn = cx_Oracle.connect("cx_Oracle/welcome")
cursor = conn.cursor()

cursor.execute("truncate table TestTempTable")

sql = "insert into TestTempTable values (:1, :2)"
# First batch: the second column is always None, so its bind type is
# inferred from NULLs alone.
cursor.executemany(sql, [(1, None), (2, None)])
# Second batch: the second column now contains a string, which the
# previously inferred bind type cannot hold, raising NotSupportedError.
cursor.executemany(sql, [(3, None), (4, "Testing")])
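
And a minimal sketch of workaround (3) applied to the same test case (the length 20 is an assumed maximum for the string column):

cursor = conn.cursor()
# Predeclare the bind types: an integer and a string of up to 20
# characters, so the all-NULL first batch cannot mislead the type
# inference.
cursor.setinputsizes(int, 20)
cursor.executemany(sql, [(1, None), (2, None)])
cursor.executemany(sql, [(3, None), (4, "Testing")])  # now succeeds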
Anthony Tuininga