0

I have a SQL Command task in SQL Server which is updating only 20 rows per second, but I need to update more than 200,000 rows, which is taking a long time. When I use SCD (Type 2) it is neither inserting nor updating any records (and not even giving any error).

Some batches of rows are getting transferred and the SQL Command task turns yellow. It is updating the columns, but very slowly (20 rows per second).

How can I increase the speed of the update?

Zerotoinfinity

2 Answers

3

Don't use the SQL Task. It performs singleton operations, so you can expect 200,000 individual update statements to be issued against the target database. You might get a marginal boost if you created a stored procedure to avoid a few steps in the query compilation process, but you'd have to test and see.

The real way to get a performance boost is to create a staging table and dump all the rows to be updated into it. After the data flow has completed, wire up an Execute SQL Task to perform a batch update from the staging table to your target table.
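A minimal sketch of what that batch update might look like; the table and column names here (dbo.Customer, dbo.Staging_Customer, CustomerID, etc.) are hypothetical, so substitute your own schema:

    -- One set-based UPDATE instead of 200,000 singleton updates.
    -- dbo.Customer is the target table; dbo.Staging_Customer is the
    -- staging table loaded by the data flow. Both names are illustrative.
    UPDATE T
    SET    T.CustomerName = S.CustomerName,
           T.Email        = S.Email
    FROM   dbo.Customer AS T
           INNER JOIN dbo.Staging_Customer AS S
               ON S.CustomerID = T.CustomerID;

    -- Optionally clear the staging table so it is empty for the next run.
    TRUNCATE TABLE dbo.Staging_Customer;

Because the join is evaluated once for the whole set, SQL Server can use a single query plan for all 200,000 rows instead of compiling and executing one statement per row.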

Let me know if a picture would make that clearer.

billinkc
  • That is a very good suggestion. +1 for that. But I have so many tables in my database that I cannot create a supplementary/intermediate update table for each one. Is it possible to store that output in a recordset and use the recordset as a source? – Zerotoinfinity Sep 12 '12 at 19:06
0

If you are using SQL Server 2008 or above, try the MERGE statement (in an Execute SQL Task), which works in a set-based way rather than the RBAR (row-by-row) approach of a SQL Command.
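A hedged sketch of such a MERGE, using the same hypothetical staging and target table names as the staging-table answer above (adjust the join key and column list to your schema):

    -- Set-based upsert: updates matched rows and inserts new ones
    -- in a single statement, driven by the staging table.
    MERGE dbo.Customer AS T
    USING dbo.Staging_Customer AS S
        ON T.CustomerID = S.CustomerID
    WHEN MATCHED THEN
        UPDATE SET T.CustomerName = S.CustomerName,
                   T.Email        = S.Email
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerID, CustomerName, Email)
        VALUES (S.CustomerID, S.CustomerName, S.Email);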

pjones