I want to bulk-copy data from SQLite3 files to an Oracle DB, and do so programmatically from within a Python script using the jaydebeapi
module. (I have no control over the choice of Python + jaydebeapi; it is imposed by the project I am collaborating on.)
One way to do it would be to dump the SQLite3 tables to temporary CSV files and load them into Oracle with the SQL*Loader utility (sqlldr). (LOAD DATA INFILE is the MySQL equivalent; Oracle does not support that command.)
I am looking for a way to achieve the same end result that avoids creating the intermediate temporary files.
More specifically, since I can bulk-read the SQLite3 tables into memory (with simple SELECT
statements), what I need is the bulk-write counterpart to dump the tables from memory into the Oracle database.
EDIT: This is a recurring task. The largest table to be copied typically has ~100K rows.