
I have a package in SSIS with multiple tasks. I am loading files, and if the File System Task at the end fails I want to be able to roll back the transaction. My package looks like this (see general_package). I would like to be able to roll back all the operations the SSIS Script Task has done. To do that I need the Script Task to enlist in the transaction created by the BEGIN_TRANSACTION SQL task. How can I do that?

In the SSIS Script Task, to enlist in the transaction I do:

object rawConnection = Dts.Connections["destination_ado"].AcquireConnection(Dts.Transaction);
myADONETConnection = (SqlConnection)rawConnection;

Then I do a bulk copy:

using (SqlBulkCopy sbc = new SqlBulkCopy(myADONETConnection))
{
    sbc.DestinationTableName = "[" + SCHEMA_DESTINATION + "].[" + TABLE_DESTINATION + "]";
    // sbc.DestinationTableName = "test_load";

    // Number of records to be processed in one go
    sbc.BatchSize = 10000;

    // Finally write to server
    sbc.WriteToServer(destination);
}
myADONETConnection.Close();

How do I tell the SqlBulkCopy to use the existing transaction? In the options of the connection manager in SSIS I have set RetainSameConnection to True.

Thanks for all your thoughts.

Vincent

2 Answers


Looking at your package, I see that you are iterating through a bunch of files and on each iteration you load the file's contents into your destination tables. You want all your data loads to be atomic, i.e. fully loaded or not loaded at all.

With this in mind I would like to suggest the following approaches; in all of them there is no need to use a Script Task or explicit Begin/End Transaction blocks:

  1. Use a Data Flow Task and in its properties set TransactionOption to Required. This enables the transaction on that block.
  2. Add error redirection at the destination to an error table, in a batch-wise manner, so as to keep the number of failed rows as low as possible (see http://agilebi.com/jwelch/2008/09/05/error-redirection-with-the-ole-db-destination/). We successfully used 100k, 50k and 1 as the three batch sizes when doing data loads of over a million rows per day. You can then deal with those errors separately.
  3. If the use case is such that the whole load has to fail, then just redirect the failed records. Move the file to a 'failed' folder using the File System Task (FST), and have a DFT following the FST perform a lookup on the destination and delete all those records.
VKarthik
  • Hi, thanks for your answer. The reason I don't use the Data Flow is that the package is used for different tables with different structures, and a Data Flow can't have generic metadata for columns – Vincent Diallo-Nort Nov 14 '16 at 21:48
  • Could you give more details or an example of what you mean by different tables with different structures? What use case are you trying to solve? – VKarthik Nov 14 '16 at 22:34

So I found a solution. In the first Script Task (extract and load) I create a transaction with this code:

SqlTransaction tran = myADONETConnection.BeginTransaction(IsolationLevel.ReadCommitted);

Then I use this transaction in the SqlBulkCopy this way:

using (SqlBulkCopy sbc = new SqlBulkCopy(myADONETConnection, SqlBulkCopyOptions.Default, tran))

Then I pass the transaction object to an SSIS variable (of type Object):

Dts.Variables["User::transaction_object"].Value = tran;

Then in my two blocks at the end, Commit transaction and Rollback transaction, I use a Script Task that reads the variable and either commits or rolls back the transaction:

SqlTransaction tran = (SqlTransaction)Dts.Variables["User::transaction_object"].Value;
tran.Commit();
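
The Rollback transaction block follows the same pattern; a minimal sketch, assuming User::transaction_object is an Object-typed variable exposed to that script:

// Read the open transaction back from the SSIS variable and undo
// everything done on the connection since BeginTransaction.
SqlTransaction tran = (SqlTransaction)Dts.Variables["User::transaction_object"].Value;
tran.Rollback();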

As a result, if a file cannot be moved to the Archive folder its data doesn't get loaded twice. A transaction is started for each file, so if a file can't be moved, only the data from that file gets rolled back and the enumerator keeps going to the next one.
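
Putting the pieces together, here is a rough sketch of what the body of the extract-and-load Script Task might look like under the assumptions above: a connection manager named destination_ado with RetainSameConnection set to True, the SCHEMA_DESTINATION / TABLE_DESTINATION strings and the DataTable destination from the original snippets, and an Object-typed variable User::transaction_object. It is illustrative, not a drop-in implementation:

// using System.Data;            (for IsolationLevel)
// using System.Data.SqlClient;  (for SqlConnection, SqlTransaction, SqlBulkCopy)

// Acquire the ADO.NET connection; RetainSameConnection=True keeps the same
// physical connection alive across the tasks that follow.
object rawConnection = Dts.Connections["destination_ado"].AcquireConnection(Dts.Transaction);
SqlConnection myADONETConnection = (SqlConnection)rawConnection;

// Open a transaction on that connection.
SqlTransaction tran = myADONETConnection.BeginTransaction(IsolationLevel.ReadCommitted);

// Bulk copy inside the transaction.
using (SqlBulkCopy sbc = new SqlBulkCopy(myADONETConnection, SqlBulkCopyOptions.Default, tran))
{
    sbc.DestinationTableName = "[" + SCHEMA_DESTINATION + "].[" + TABLE_DESTINATION + "]";
    sbc.BatchSize = 10000;
    sbc.WriteToServer(destination); // 'destination' is the DataTable filled from the file
}

// Hand the still-open transaction to the Commit/Rollback script tasks.
// Note: the connection is deliberately not closed here; closing it would
// roll back the pending transaction before the later tasks can decide.
Dts.Variables["User::transaction_object"].Value = tran;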