
I have a database table with 3 columns, all strings. I need to upload 200,000 rows of data. I can do this, but as you would expect it is taking much too long, 30+ minutes.

I am now trying to use SQLBulkCopy to do it faster but I cannot understand how to do it correctly. A sample of a row of data would be:

"test string","test string","test string"

Should I write my data to a temp file so SqlBulkCopy can use it to upload the data? For example, have each line represent a row and delimit the fields with commas?

Here is what I have so far, any help would be great!

// this method gets a list of the data objects passed in

List<string[]> listOfRows = new List<string[]>(); // holds all the rows to be bulk copied to the table in the database

foreach (DataUnit dataUnit in dataUnitList)
{
    string[] row = new string[3]; // was new string[2], which throws IndexOutOfRangeException on row[2]
    row[0] = dataUnit.value1.ToString();
    row[1] = dataUnit.value2.ToString();
    row[2] = dataUnit.value3.ToString();
    listOfRows.Add(row);
}

// Note: do not call File.Create first - it returns an open FileStream, so the
// StreamWriter below would fail with "file in use". StreamWriter creates the file itself.
using (System.IO.StreamWriter file = new System.IO.StreamWriter(Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Desktop), "Test200kFile.txt")))
{
    foreach (string[] array in listOfRows)
    {
        // WriteLine, not Write, so each row lands on its own line
        file.WriteLine(array[0] + "," + array[1] + "," + array[2]);
    }
}

using (MyFileDataReader reader = new MyFileDataReader(Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Desktop), "Test200kFile.txt")))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy("my connection string to database"))
{
    bulkCopy.DestinationTableName = "my table name";
    bulkCopy.BatchSize = 20000; // what if it's less than 200,000?

    bulkCopy.WriteToServer(reader);
} // the using block disposes the bulk copy, so no explicit Close() is needed
Peter
  • I live by the rule that if it makes my app faster, it's better, and if it's lighter on memory, it's better. With bulk copying, it's best for me to have a CSV file. SQL Server can take that CSV and put it into a table as it is... It's faster and better. So IMO: CSV -> SQL machine -> bulk copy the CSV... – Jonny Aug 19 '14 at 10:59
  • So just write my data to a CSV file and tell SQL bulk copy to load the CSV file into the table? – Peter Aug 19 '14 at 11:00
  • Go look at this. http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/ – Jonny Aug 19 '14 at 11:01
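For context, the approach the comments describe, loading a CSV straight into a table with T-SQL BULK INSERT, would look roughly like the sketch below. The table name, connection string, and file path are placeholders, and note that BULK INSERT resolves the path on the SQL Server machine, not on the client:

```csharp
using System;
using System.Data.SqlClient;

public class BulkInsertSketch
{
    // Builds the T-SQL statement the linked blog post describes.
    // tableName and serverSidePath are caller-supplied placeholders.
    public static string BuildBulkInsertSql(string tableName, string serverSidePath)
    {
        return "BULK INSERT " + tableName +
               " FROM '" + serverSidePath + "'" +
               " WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n')";
    }

    // Executes the statement; requires the CSV to already exist at a path
    // the database server can see.
    public static void Run(string connectionString, string tableName, string serverSidePath)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(BuildBulkInsertSql(tableName, serverSidePath), conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

This only works when the client can place the file somewhere visible to SQL Server, which is exactly the limitation the answer below points out.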

1 Answer


The BULK INSERT T-SQL statement requires the source data to be present in a file on the DB server. That's not the way you want to go if you are writing code.

Instead, you should create an adapter around whatever data source you have. The adapter should implement parts of the IDataReader interface - SqlBulkCopy only makes use of a few of the methods of IDataReader, so only those need to be implemented:

Properties

  • FieldCount

Methods

  • GetName
  • GetOrdinal
  • GetValue
  • Read

(source: http://www.michaelbowersox.com/2011/12/22/using-a-custom-idatareader-to-stream-data-into-a-database/)
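A minimal sketch of such an adapter, reading from an in-memory List<string[]> of three string columns (the class name ListDataReader and its layout are my own illustration, not an established API):

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// Streams an in-memory list of string rows one record at a time.
public class ListDataReader : IDataReader
{
    private readonly IList<string[]> _rows;
    private readonly string[] _columnNames;
    private int _current = -1; // Read() advances to 0 on first call

    public ListDataReader(IList<string[]> rows, string[] columnNames)
    {
        _rows = rows;
        _columnNames = columnNames;
    }

    // The members SqlBulkCopy actually uses:
    public int FieldCount { get { return _columnNames.Length; } }
    public string GetName(int i) { return _columnNames[i]; }
    public int GetOrdinal(string name) { return Array.IndexOf(_columnNames, name); }
    public object GetValue(int i) { return _rows[_current][i]; }
    public bool Read() { return ++_current < _rows.Count; }

    // Housekeeping members that are cheap to implement:
    public bool IsClosed { get { return _current >= _rows.Count; } }
    public void Close() { _current = _rows.Count; }
    public void Dispose() { Close(); }
    public int Depth { get { return 0; } }
    public int RecordsAffected { get { return -1; } } // -1 = not applicable
    public bool NextResult() { return false; }        // single result set
    public Type GetFieldType(int i) { return typeof(string); }
    public string GetString(int i) { return (string)GetValue(i); }
    public bool IsDBNull(int i) { return GetValue(i) == null; }
    public object this[int i] { get { return GetValue(i); } }
    public object this[string name] { get { return GetValue(GetOrdinal(name)); } }

    // The remaining IDataRecord members are never called in this scenario,
    // so the stubs just throw.
    public DataTable GetSchemaTable() { throw new NotSupportedException(); }
    public bool GetBoolean(int i) { throw new NotSupportedException(); }
    public byte GetByte(int i) { throw new NotSupportedException(); }
    public long GetBytes(int i, long o, byte[] b, int bo, int l) { throw new NotSupportedException(); }
    public char GetChar(int i) { throw new NotSupportedException(); }
    public long GetChars(int i, long o, char[] b, int bo, int l) { throw new NotSupportedException(); }
    public IDataReader GetData(int i) { throw new NotSupportedException(); }
    public string GetDataTypeName(int i) { throw new NotSupportedException(); }
    public DateTime GetDateTime(int i) { throw new NotSupportedException(); }
    public decimal GetDecimal(int i) { throw new NotSupportedException(); }
    public double GetDouble(int i) { throw new NotSupportedException(); }
    public float GetFloat(int i) { throw new NotSupportedException(); }
    public Guid GetGuid(int i) { throw new NotSupportedException(); }
    public short GetInt16(int i) { throw new NotSupportedException(); }
    public int GetInt32(int i) { throw new NotSupportedException(); }
    public long GetInt64(int i) { throw new NotSupportedException(); }
    public int GetValues(object[] values) { throw new NotSupportedException(); }
}
```

An instance of this class can be handed straight to bulkCopy.WriteToServer(reader), which pulls one row at a time through Read/GetValue.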

So unless your data is already in a file, there is no need to write it to one. Just make sure that you can present one record at a time through the IDataReader interface.

Avoiding saving the data to a file saves a lot of resources and time. I once implemented an import job that read XML data from an FTP data source through an XmlReader, which was then wrapped in a custom IDataReader. That way I could stream the data from the FTP server, through the app server, to the DB server without having to write the entire data set to disk on the app server.

Anders Abel
    Or, instead of wrapping the data source in a `IDataReader`, simply transfer the source data into a `DataTable`. [`SqlBulkCopy` supports `DataTable`s](http://msdn.microsoft.com/en-us/library/ex21zs8x.aspx "MSDN reference page") out-of-the-box. – stakx - no longer contributing Aug 19 '14 at 11:09
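The DataTable route from the comment above, sketched with placeholder column names, connection string, and table name (none of these are from the question):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public class DataTableBulkCopy
{
    // Copies the rows into a DataTable whose schema matches the destination
    // table. "Column1".."Column3" are placeholders - use the real names.
    public static DataTable BuildTable(List<string[]> rows)
    {
        DataTable table = new DataTable();
        table.Columns.Add("Column1", typeof(string));
        table.Columns.Add("Column2", typeof(string));
        table.Columns.Add("Column3", typeof(string));

        foreach (string[] row in rows)
        {
            table.Rows.Add(row[0], row[1], row[2]);
        }
        return table;
    }

    public static void Upload(DataTable table)
    {
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy("my connection string to database"))
        {
            bulkCopy.DestinationTableName = "my table name";
            bulkCopy.WriteToServer(table); // DataTable overload - no IDataReader needed
        }
    }
}
```

The trade-off versus the custom IDataReader is memory: the entire 200,000-row set sits in the DataTable at once, though at three short strings per row that is usually acceptable.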