Here's my situation: I have an imported table with about a million records, and I need to perform a lot of updates, inserts, etc. against this and other tables based on what is in each record.
I could do this with a few set-based SQL statements. However, the DBAs don't want me to take that approach, because many of the actions touch heavily used tables and they don't want me locking large numbers of records at once.
On the other hand, I don't want to process the table row by row with a cursor either.
I would like to test processing batches of a hundred rows at a time, as follows:
1. Add a `batch_number` column to the imported table and populate it with an integer that increments every 100 rows.
2. Loop from 1 through the maximum `batch_number`. Inside the loop, run the set-based ETL statements, each with an additional `WHERE batch_number = @loop_number` clause (see the sketch after this list).
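To make the idea concrete, here is a minimal T-SQL sketch of both steps. It assumes SQL Server and purely hypothetical names: a staging table `dbo.ImportStaging` keyed by `ImportId`, and a target table `dbo.TargetTable` joined on `BusinessKey`. The real ETL statements would replace the example `UPDATE`.

```sql
-- Step 1: add and populate batch_number (one batch per 100 rows).
-- All table and column names here are hypothetical placeholders.
ALTER TABLE dbo.ImportStaging ADD batch_number INT NULL;
GO

WITH numbered AS (
    SELECT batch_number,
           (ROW_NUMBER() OVER (ORDER BY ImportId) - 1) / 100 + 1 AS bn
    FROM dbo.ImportStaging
)
UPDATE numbered
SET batch_number = bn;

-- An index keeps the per-batch WHERE clause from scanning all rows.
CREATE INDEX IX_ImportStaging_batch ON dbo.ImportStaging (batch_number);

-- Step 2: loop from 1 through the max batch_number, running the
-- set-based statements against only that batch's rows.
DECLARE @batch INT = 1, @max_batch INT;
SELECT @max_batch = MAX(batch_number) FROM dbo.ImportStaging;

WHILE @batch <= @max_batch
BEGIN
    BEGIN TRANSACTION;

    -- One example set-based statement, scoped to the current batch;
    -- the real ETL would run several statements with the same filter.
    UPDATE t
    SET    t.SomeColumn = s.SomeColumn
    FROM   dbo.TargetTable AS t
    JOIN   dbo.ImportStaging AS s
      ON   s.BusinessKey = t.BusinessKey
    WHERE  s.batch_number = @batch;

    COMMIT TRANSACTION;  -- releases this batch's locks before moving on

    SET @batch += 1;
END
```

Because each batch commits separately, locks should only be held for the 100 rows being processed at any moment rather than for the whole million-row run.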
Is this a sound approach, or is there a better alternative?