Question
What is the best way to update a column in a table with tens of millions of rows?
1) I saw creating a new table and renaming the old one when finished
2) I saw updating in batches using a temp table
3) I saw a single transaction (I don't like this one, though)
4) I've never heard a cursor solution recommended for a problem like this, and I don't think it's worth trying
5) I read about loading data from a file (using BCP), but I haven't read whether the performance is better or not. It was not clear if it is just for copying, or if it would allow joining a big table with something and then bulk copying.
I would really appreciate some advice here.
Priority is performance.
At the moment I'm testing solution 2) and exploring solution 5).
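For context, solution 2) is usually sketched as a loop that updates a fixed number of rows per batch. Here is a minimal, hedged sketch; the table `dbo.BigTable`, the column `TargetCol`, the literal `'NewValue'`, and the batch size are all hypothetical placeholders, not from the original question:

```sql
-- Minimal sketch of option 2): batched UPDATE.
-- dbo.BigTable, TargetCol, and 'NewValue' are assumed names for illustration.
DECLARE @BatchSize int = 50000;
DECLARE @Rows int = 1;

WHILE @Rows > 0
BEGIN
    UPDATE TOP (@BatchSize) t
    SET    t.TargetCol = 'NewValue'
    FROM   dbo.BigTable AS t
    WHERE  t.TargetCol <> 'NewValue';   -- skip rows already updated

    SET @Rows = @@ROWCOUNT;             -- 0 when no rows remain

    CHECKPOINT;  -- under SIMPLE recovery, allows log truncation between batches
END
```

Smaller batches keep transaction-log growth and lock escalation in check; since the operation runs during downtime, switching the database to SIMPLE or BULK_LOGGED recovery for the duration (and taking a backup afterwards) is a common companion to this pattern.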
Additional Information (UPDATE)
Thank you for the critical thinking here.
- The operation will be done during downtime.
- The UPDATE will not cause row forwarding.
- All the tables have indexes, 5 on average, although a few tables have around 13 indexes.
- The probability that the target column is present in one of a table's indexes is about 50%.
- Some tables can be rebuilt and replaced; others can't, because they are part of a software solution and we might lose support for them. Some of those tables have triggers.
- I'll need to do this for more than 600 tables, ~150 of which range from 0.8 million to 35 million rows.
- The update is always to the same column across the various tables.