I need to build a batch process in Java that reads a file with a variable number of records (though I can safely assume 5k+), processes them, and then inserts them into an Oracle 11g database. I need to do the insertion via call(s) to a stored procedure.
I built a similar process some months ago that I am not too proud of, mostly because I am sure it is far from optimal. In that solution I kept everything in memory and, to insert the rows, built one giant INSERT ALL statement by appending every record with a StringBuilder, then executed that single statement to insert 15k+ records in one go. This time I am required to use a stored procedure for the insertion. I have been reading and now know it is possible to pass arrays of data to stored procedures, so I could send multiple records per call.
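For reference, this is roughly what I understand the array-based call would look like on the Java side. The collection type NUM_TABLE and the procedure insert_records are hypothetical names I made up; also, as far as I can tell Oracle's driver does not implement the standard Connection.createArrayOf for named collection types, so the java.sql.Array would normally be created through the vendor API (OracleConnection.createARRAY), which I have only sketched in a comment:

```java
import java.sql.Array;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;

public class Main {
    // Builds the JDBC escape syntax for a procedure call with n parameters,
    // e.g. callSql("insert_records", 2) -> "{call insert_records(?, ?)}"
    static String callSql(String procedure, int nParams) {
        StringBuilder sb = new StringBuilder("{call ").append(procedure).append('(');
        for (int i = 0; i < nParams; i++) sb.append(i == 0 ? "?" : ", ?");
        return sb.append(")}").toString();
    }

    // Sends one java.sql.Array holding all the values in a single call to a
    // hypothetical procedure insert_records(p_ids IN num_table).
    // With Oracle's driver the Array itself would usually come from the
    // vendor API, something like:
    //   Array ids = conn.unwrap(oracle.jdbc.OracleConnection.class)
    //                   .createARRAY("NUM_TABLE", data);
    // (type and procedure names are assumptions, not real objects).
    static void insertViaArray(Connection conn, Array ids) throws SQLException {
        try (CallableStatement cs = conn.prepareCall(callSql("insert_records", 1))) {
            cs.setArray(1, ids);
            cs.execute();
        }
    }

    public static void main(String[] args) {
        // No database here; just show the call syntax being generated.
        System.out.println(callSql("insert_records", 1));
    }
}
```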
Should I make a stored procedure that receives arrays and send all of the records - potentially thousands - in one single call to that SP? Or should I cap it at a certain number of records per call and invoke the SP (records / limit) times?
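To make the second option concrete, this is the chunking I have in mind (the batch size of 500 is an arbitrary example, not a recommendation):

```java
import java.util.ArrayList;
import java.util.List;

public class Main {
    // Splits the records into chunks of at most chunkSize elements; the
    // stored procedure would then be called once per chunk, with one array
    // per column built from that chunk.
    static <T> List<List<T>> partition(List<T> records, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int from = 0; from < records.size(); from += chunkSize) {
            int to = Math.min(from + chunkSize, records.size());
            chunks.add(new ArrayList<>(records.subList(from, to)));
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> records = new ArrayList<>();
        for (int i = 0; i < 5001; i++) records.add(i);

        List<List<Integer>> chunks = partition(records, 500);
        System.out.println(chunks.size());         // 11 chunks for 5001 records
        System.out.println(chunks.get(10).size()); // last chunk holds the remainder: 1
    }
}
```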
Or should I stay away from arrays altogether, and have a stored procedure that receives the data for a single record, calling it once per record?
If I were to do multiple calls, I was thinking of using a CallableStatement (the PreparedStatement subtype for stored procedure calls) together with the .addBatch() and .executeBatch() methods; would this be the way to go?
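A sketch of what I mean, assuming a hypothetical one-row procedure insert_record(p_id, p_name). Since CallableStatement extends PreparedStatement, addBatch/executeBatch work the same way; each executeBatch is one round trip to the database:

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import java.util.List;

public class Main {
    // How many executeBatch() round trips a given record count produces.
    static int flushCount(int records, int batchSize) {
        return (records + batchSize - 1) / batchSize; // ceiling division
    }

    // Queues one stored-procedure call per record and flushes every
    // batchSize records. Each record is a {id, name} pair here.
    static void insertAll(Connection conn, List<String[]> records, int batchSize)
            throws SQLException {
        try (CallableStatement cs = conn.prepareCall("{call insert_record(?, ?)}")) {
            int pending = 0;
            for (String[] r : records) {
                cs.setString(1, r[0]); // p_id
                cs.setString(2, r[1]); // p_name
                cs.addBatch();
                if (++pending == batchSize) {
                    cs.executeBatch();
                    pending = 0;
                }
            }
            if (pending > 0) cs.executeBatch(); // flush the remainder
        }
    }

    public static void main(String[] args) {
        // No database here; just the round-trip arithmetic.
        System.out.println(flushCount(5000, 500)); // 10
        System.out.println(flushCount(5001, 500)); // 11
    }
}
```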
I need to insert all of the records or none of them: on any error, everything must roll back, so I plan to wrap the whole thing in a transaction. I am not technically required to meet any performance threshold, but I am interested in the topic and this seems like a good time to start paying more attention to it, so I would appreciate pointers and tips from someone with experience in the area.
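This is the transaction shape I have in mind: one commit for the whole file, rollback on any failure. The demo below stubs java.sql.Connection with a dynamic proxy purely so the control flow can be exercised without a real database:

```java
import java.lang.reflect.Proxy;
import java.sql.Connection;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class Main {
    interface Work {
        void run(Connection conn) throws SQLException;
    }

    // Runs work as one transaction: commit if it completes, rollback otherwise.
    static void inTransaction(Connection conn, Work work) throws SQLException {
        boolean previous = conn.getAutoCommit();
        conn.setAutoCommit(false);
        try {
            work.run(conn);
            conn.commit();
        } catch (SQLException | RuntimeException e) {
            conn.rollback();
            throw e; // let the caller see what went wrong
        } finally {
            conn.setAutoCommit(previous);
        }
    }

    // Records every method called on a fake Connection (no database needed).
    static Connection fakeConnection(List<String> calls) {
        return (Connection) Proxy.newProxyInstance(
                Connection.class.getClassLoader(),
                new Class<?>[] {Connection.class},
                (proxy, method, args) -> {
                    calls.add(method.getName());
                    Class<?> rt = method.getReturnType();
                    if (rt == boolean.class) return true; // e.g. getAutoCommit()
                    if (rt == int.class) return 0;
                    return null;
                });
    }

    public static void main(String[] args) throws SQLException {
        List<String> calls = new ArrayList<>();
        inTransaction(fakeConnection(calls), conn -> { /* all inserts succeed */ });
        System.out.println(calls.contains("commit"));    // true

        calls.clear();
        try {
            inTransaction(fakeConnection(calls), conn -> {
                throw new SQLException("simulated insert failure");
            });
        } catch (SQLException expected) { /* rolled back and rethrown */ }
        System.out.println(calls.contains("rollback"));  // true
    }
}
```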