
We already have a running system that handles all connection strings (DB2, Oracle, SQL Server).

Currently, we are using ExecuteNonQuery() to perform the inserts.

We want to improve performance by using SqlBulkCopy() instead of ExecuteNonQuery(). Some of our clients have more than 50 million records.

We don't want to use SSIS, because our system supports multiple databases.

I created a sample project to test the performance of SqlBulkCopy(), with a simple read-and-insert function for SQL Server.

Here's the small function:

public void InsertIntoSqlServer()
{
    using (SqlConnection sourceConnection = new SqlConnection(_sourceConnectionString))
    {
        //Open the connection to get the data from the source table
        sourceConnection.Open();
        using (SqlCommand command = new SqlCommand("select * from " + _sourceSchemaName + "." + _sourceTableName + ";", sourceConnection))
        {
            //Read from the source table
            command.CommandTimeout = 2400;
            using (SqlDataReader reader = command.ExecuteReader())
            using (SqlConnection destinationConnection = new SqlConnection(_destinationConnectionString))
            {
                destinationConnection.Open();

                //Clean the destination table (dispose the command when done)
                using (SqlCommand deleteCommand = new SqlCommand("delete from " + _destinationSchemaName + "." + _destinationTableName + ";", destinationConnection))
                {
                    deleteCommand.ExecuteNonQuery();
                }

                using (SqlBulkCopy bc = new SqlBulkCopy(destinationConnection))
                {
                    bc.DestinationTableName = string.Format("[{0}].[{1}]", _destinationSchemaName, _destinationTableName);
                    bc.NotifyAfter = 10000;
                    //bc.SqlRowsCopied += bc_SqlRowsCopied;
                    bc.WriteToServer(reader);
                }
            }
        }
    }
}

When I have fewer than 200,000 records in my dummy table, the bulk copy works fine. But when there are more than 200,000 records, I get the following errors:

  • Attempt to invoke bulk copy on an object that has a pending operation.

OR

  • The wait operation timed out (for the IDataReader)

I increased the CommandTimeout for the reader, which seems to have solved the timeout issue related to the IDataReader.

Am I doing something wrong in the code?

billybob
  • Never ever SqlBulkCopy to the target table. That thing has seriously broken locking code, especially when using multithreading. Create a temp table, insert into that, then copy to the target table. – TomTom Dec 15 '14 at 16:00
  • I'm not using multithreading. I'm always inserting into an empty table. – billybob Dec 15 '14 at 16:05
  • 2
    Why sqlbulkcopy at all? Seriously. Tables on he same database - just tell the SERVER to copy the data instead of pulling it to your program just to upload it. Select right into the target table with one statement. – TomTom Dec 15 '14 at 16:07
  • 1
    They are not going to be on the same server... I will read from multiple database like db2, oracle, sql server, mysql. Rigth now we are using it like your are saying. But since we need to support multiple databases, we are exploring new possibilities. – billybob Dec 15 '14 at 19:47

2 Answers


Can you try adding the following before the call to WriteToServer?

bc.BatchSize = 10000;
bc.BulkCopyTimeout = 0; // 0 means no timeout

I don't know what the default batch size or timeout is, but I suspect this might be your issue. Hope that helps.

Also, you can experiment with different batch sizes to find the optimal performance.
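Putting the two settings together, here is a minimal sketch of how they might be applied. The helper name, parameters, and table name are placeholders, and the reader/connection setup is assumed to be the same as in the question's code; running it requires a reachable SQL Server instance.

```csharp
using System.Data;
using System.Data.SqlClient;

public static class BulkCopyHelper
{
    // Hypothetical helper: copy all rows from an open data reader into a
    // destination table, committing in batches with the timeout disabled.
    public static void BulkCopyWithBatching(IDataReader reader,
                                            string destinationConnectionString,
                                            string destinationTable)
    {
        using (var destination = new SqlConnection(destinationConnectionString))
        {
            destination.Open();
            using (var bc = new SqlBulkCopy(destination))
            {
                bc.DestinationTableName = destinationTable;
                bc.BatchSize = 10000;   // commit every 10,000 rows rather than one huge batch
                bc.BulkCopyTimeout = 0; // 0 = wait indefinitely; the default is 30 seconds
                bc.WriteToServer(reader);
            }
        }
    }
}
```

With BatchSize set, each batch is committed separately, so a failure mid-copy only rolls back the current batch rather than the whole operation.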

Spock
  • specify what 0 means (Infinity) when giving an example for `BulkCopyTimeout` – JJS Dec 06 '18 at 17:07
  • 2
    default BatchSize is zero (each WriteToServer is a single batch) and default BulkCopyTimeout is 30 (seconds) – Matt Kemp Mar 04 '19 at 06:31

You can try this:

bc.BatchSize = 100000;   // how many rows to insert at a time
bc.BulkCopyTimeout = 60; // time in seconds; assign 0 for an infinite timeout
Deepesh