
We are using SQL bulk copy to insert nearly 10 million rows into a SQL Server table.

We received this exception after about 7,400,000 rows had been inserted:

OLE DB provider 'STREAM' for linked server '(null)' returned invalid data for column '[!BulkInsert].Amount'.

Please let us know if this can be resolved.

There is also a memory leak problem.

Below is our code:

try
{
    using (SqlConnection SQlConn = new SqlConnection(Common.SQLConnectionString))
    {
        DataTable dt1 = FillEmptyDateFields(dtDestination);

        //SqlTableCreator ObjTbl = new SqlTableCreator(SQlConn);
        //ObjTbl.DestinationTableName = DestinationTable;

        using (System.Data.SqlClient.SqlBulkCopy bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn, SqlBulkCopyOptions.TableLock, null))
        {
            //bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn);
            bulkCopy.DestinationTableName = DestinationTable;
            bulkCopy.BulkCopyTimeout = 0;
            bulkCopy.BatchSize = dt1.Rows.Count; // Batch Size Added.
            Logger.Log("DATATABLE FINAL :" + dt1.Rows.Count.ToString(), Logger.LogType.Info);
            if (SQlConn.State == ConnectionState.Closed || SQlConn.State == ConnectionState.Broken)
                SQlConn.Open();
            bulkCopy.WriteToServer(dt1);
            SQlConn.Close();
            SQlConn.Dispose();
            bulkCopy.Close();
            if (bulkCopy != null)
            {
                ((IDisposable)bulkCopy).Dispose();
            }
        }
        dt1.Dispose();
        dt1 = null;
    }

    dtDestination.Dispose();
    System.GC.Collect();
    dtDestination = null;
}
catch (Exception ex)
{
    Logger.Log(ex, Logger.LogType.Error);
    throw ex;
}
User87
  • insert in batches of fewer records – techBeginner Oct 11 '12 at 05:35
  • I doubt the exception was caused by a memory leak; I would instead place my bet on locating the offending row (it really seems there is one). As the previous comments suggest, you should decrease your BatchSize to a much lower value, perhaps even 100. That way, when the exception occurs, you will know the offending row lies within about 100 rows of the exception point. I would split the source data from (ExceptionPoint - 100) to the end and resume the import from that split. Once everything is fine, you can return to your current approach. Hope this helps :D – Julio Nobre Oct 12 '12 at 09:27
  • To reinforce the previous comment, take a look at Nick Tompson's answer to his own question at http://social.msdn.microsoft.com/Forums/en-US/adodotnetdataproviders/thread/d7d3d2aa-c5b6-4a96-9d17-354e498c487c?prof=required – Julio Nobre Oct 12 '12 at 11:05
  • This (http://stackoverflow.com/questions/578235/ssis-fuzzy-lookup-with-multiple-outputs-per-lookup-error) might give you more direct insight into the solution, since it is faster to check – Julio Nobre Oct 12 '12 at 11:10
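
Following the comments' suggestion of a much smaller batch size, here is a minimal sketch of how the failing row could be narrowed down. It reuses SQlConn, dt1, DestinationTable and Logger from the question's code; NotifyAfter/SqlRowsCopied only log progress, so when the exception is thrown the last logged count brackets the bad row to within roughly one batch:

// Sketch only, assuming the same SQlConn, dt1, DestinationTable and Logger as in the question.
using (System.Data.SqlClient.SqlBulkCopy bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn, SqlBulkCopyOptions.TableLock, null))
{
    bulkCopy.DestinationTableName = DestinationTable;
    bulkCopy.BulkCopyTimeout = 0;
    bulkCopy.BatchSize = 100;    // small batches instead of all rows at once
    bulkCopy.NotifyAfter = 100;  // raise SqlRowsCopied every 100 rows
    bulkCopy.SqlRowsCopied += (sender, e) =>
        Logger.Log("Rows copied so far: " + e.RowsCopied, Logger.LogType.Info);
    bulkCopy.WriteToServer(dt1); // on failure, the last logged count points at the bad region
}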

2 Answers


Based on your code it looks like you're setting the BatchSize to the size of the datatable, presumably 10 million rows. You may want to try using a value of something like 5000.
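
For illustration, a minimal sketch of that change, reusing the SQlConn, DestinationTable and dt1 names from the question (5000 is only a starting point to tune):

// Sketch only: same SqlBulkCopy setup as in the question, but with a fixed,
// smaller batch size instead of bulkCopy.BatchSize = dt1.Rows.Count.
using (System.Data.SqlClient.SqlBulkCopy bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn, SqlBulkCopyOptions.TableLock, null))
{
    bulkCopy.DestinationTableName = DestinationTable;
    bulkCopy.BulkCopyTimeout = 0;
    bulkCopy.BatchSize = 5000;   // rows are committed in 5000-row batches
    bulkCopy.WriteToServer(dt1); // assumes SQlConn has already been opened
}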

John

Maybe because you're trying to insert a NaN value into a float field?

I had this problem recently. To figure out what was wrong, I inserted everything into a table where the data type was NVARCHAR(MAX), and then I noticed that one value was NaN.
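
If NaN is the cause here, then rather than re-importing into an NVARCHAR(MAX) table you could also scan the DataTable directly before the bulk copy. A minimal sketch, assuming the dt1 table and Logger from the question, and that "Amount" (the column named in the error message) is typed as double:

// Sketch only: look for NaN/Infinity in the "Amount" column, which a SQL Server
// float column will reject during the bulk copy. Use float.IsNaN instead if the
// DataColumn is typed as Single.
foreach (DataRow row in dt1.Rows)
{
    object value = row["Amount"];
    if (value is double)
    {
        double d = (double)value;
        if (double.IsNaN(d) || double.IsInfinity(d))
        {
            Logger.Log("Invalid Amount at row index " + dt1.Rows.IndexOf(row), Logger.LogType.Error);
            row["Amount"] = DBNull.Value; // or fix/remove the row before WriteToServer
        }
    }
}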