
Here’s my situation: there is an imported table with about a million records. Based on what is in each record, I need to perform a lot of updates, inserts, etc. against this and other tables.

I could do this using a few set-based SQL statements. However, the DBAs don’t want me to use that approach, because many of the actions touch heavily used tables and they don’t want me locking a lot of records at once.

However, I don’t want to use a row-by-row cursor approach either.

Instead, I would like to test processing the data in batches of one hundred rows at a time, as follows:

Add a batch_number field to the imported table and populate it with an integer that increments once for every 100 rows.
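
For concreteness, here is a minimal sketch of that numbering step. The table name imported_data and its key column id are placeholders, not names from my actual schema:

    -- Hypothetical names: imported_data (the staging table) and id (its key).
    ALTER TABLE imported_data ADD (batch_number NUMBER);

    -- Assign batch 1 to rows 1-100, batch 2 to rows 101-200, and so on,
    -- using a deterministic ordering on the key column.
    MERGE INTO imported_data t
    USING (SELECT id,
                  CEIL(ROW_NUMBER() OVER (ORDER BY id) / 100) AS rn
           FROM   imported_data) s
    ON (t.id = s.id)
    WHEN MATCHED THEN UPDATE SET t.batch_number = s.rn;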

Then loop from 1 through the maximum batch_number. Inside that loop I’ll use the set-based SQL ETL approach, adding to each statement a WHERE clause of the form: WHERE batch_number = <current loop number>.
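
A minimal PL/SQL sketch of that driver loop, using the same placeholder names as above (target_table and some_col are likewise hypothetical stand-ins for the real ETL statements):

    DECLARE
      v_max_batch NUMBER;
    BEGIN
      SELECT MAX(batch_number) INTO v_max_batch FROM imported_data;

      FOR i IN 1 .. v_max_batch LOOP
        -- One set-based statement per batch; substitute the real ETL logic.
        UPDATE target_table t
        SET    t.some_col = (SELECT s.some_col
                             FROM   imported_data s
                             WHERE  s.id = t.id)
        WHERE  EXISTS (SELECT 1
                       FROM   imported_data s
                       WHERE  s.id = t.id
                       AND    s.batch_number = i);

        COMMIT;  -- release locks after each ~100-row batch
      END LOOP;
    END;
    /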

Is this a sound approach, or is there a better alternative?

user1234795
  • You might want to look at the cursor BULK COLLECT ... LIMIT approach (sketched after these comments) – Gurwinder Singh Dec 12 '16 at 17:30
  • Thanks - I'll take a look into that – user1234795 Dec 12 '16 at 18:15
  • Sounds like you need to get some new DBAs. They ought to understand how Oracle databases work, and the importance of transactions and read consistency. They should not be telling you to use an approach which will (not may) introduce data anomalies, not to mention the performance concerns. – APC Dec 13 '16 at 08:26
  • Doesn't sound like a bad solution, but don't forget to keep track of which batches you have processed/committed, so that if your job fails it can restart from where it left off. – alexs Dec 13 '16 at 10:10
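
For reference, the BULK COLLECT ... LIMIT pattern mentioned in the first comment looks roughly like this. Again, imported_data, target_table, id, and some_col are placeholder names; scalar collections are used rather than a collection of records to keep the FORALL portable across Oracle versions:

    DECLARE
      TYPE t_ids  IS TABLE OF imported_data.id%TYPE;
      TYPE t_vals IS TABLE OF imported_data.some_col%TYPE;
      v_ids  t_ids;
      v_vals t_vals;
      CURSOR c_src IS SELECT id, some_col FROM imported_data;
    BEGIN
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO v_ids, v_vals LIMIT 100;  -- 100 rows per fetch
        EXIT WHEN v_ids.COUNT = 0;

        -- Apply the fetched chunk in a single bulk-bound statement.
        FORALL i IN 1 .. v_ids.COUNT
          UPDATE target_table
          SET    some_col = v_vals(i)
          WHERE  id = v_ids(i);

        COMMIT;  -- one commit per chunk, as in the batch_number approach
      END LOOP;
      CLOSE c_src;
    END;
    /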

0 Answers