I am currently inserting data from a file into a table. I format the data before inserting it (removing tabs and carriage returns, converting certain text to character codes, etc.). For each record I first have to check whether it already exists in the table and update it if it does; if it doesn't, the record must be inserted.
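For context, the formatting step is roughly this (simplified; the real version also maps certain text to character codes):

-- strip tabs, carriage returns, and line feeds from a field
function clean_field(p_val in varchar2) return varchar2 is
begin
    return replace(replace(replace(p_val, chr(9)), chr(13)), chr(10));
end clean_field;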
The problem I am having is that it takes an extremely long time. I have two files, one with just under 500k records and one with over 4 million. Is it possible to run a bulk collect so that it does both the update and the insert?
E.g.:
open c_cur;
loop
    fetch c_cur bulk collect into examp limit 50000;
    exit when examp.count = 0;

    forall x in 1 .. examp.count
        update table1
           set ...
         where ...;

    -- pseudocode: this is the part I don't know how to express;
    -- for the rows the update missed, I want to insert instead
    if sql%notfound then
        insert into table1
        values (...);
    end if;

    commit;
end loop;
close c_cur;
Is this possible?
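In case it clarifies what I'm after: from what I've read, SQL%BULK_ROWCOUNT might let me split each batch into an update pass and an insert pass. This is only a sketch (missed is a hypothetical associative array of the same record type as examp), and I don't know if it's the right approach:

    -- update pass for the whole batch
    forall x in 1 .. examp.count
        update table1
           set ...
         where ...;

    -- collect the rows the update didn't touch
    missed.delete;
    for x in 1 .. examp.count loop
        if sql%bulk_rowcount(x) = 0 then
            missed(missed.count + 1) := examp(x);
        end if;
    end loop;

    -- insert pass for the missed rows
    if missed.count > 0 then
        forall x in 1 .. missed.count
            insert into table1
            values missed(x);
    end if;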
I have also heard about setting up a staging table, dumping the file into it, and then working from there. Is that the better option? How much of a performance increase would it give me?
I would also prefer not to do a MERGE on the table from a file.
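The staging version I have in mind would avoid MERGE entirely and look roughly like this (sketch only; stage1, col1, and col2 are made-up names, and I'm assuming col1 is the matching key):

-- load the file into stage1 first (e.g. SQL*Loader or an external
-- table), then do the whole job in two set-based statements
update table1 t
   set t.col2 = (select s.col2 from stage1 s where s.col1 = t.col1)
 where exists (select 1 from stage1 s where s.col1 = t.col1);

insert into table1 (col1, col2)
select s.col1, s.col2
  from stage1 s
 where not exists (select 1 from table1 t where t.col1 = s.col1);

commit;

My assumption is that two set-based statements would beat millions of single-row operations, but I don't know how big the difference would be in practice.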
Thanks.