I have a database table with 3 columns, all strings, and I need to upload 200,000 rows of data. I can do this, but as you would expect it is taking far too long: 30+ minutes.
I am now trying to use SqlBulkCopy to do it faster, but I cannot work out how to use it correctly. A sample row of data would be:
"test string","test string,"test string"
Should I write my data to a temp file so that SqlBulkCopy can use it to upload the data? For example, have each line represent a row and delimit the fields with commas?
Here is what I have so far; any help would be great!
// this method receives the list of DataUnit objects to upload (dataUnitList)
List<string[]> listOfRows = new List<string[]>(); // holds all the rows to be bulk copied to the table in the database
foreach (DataUnit dataUnit in dataUnitList)
{
    string[] row = new string[3]; // 3 columns, so the array needs 3 elements (it was [2], which throws IndexOutOfRangeException)
    row[0] = dataUnit.value1.ToString();
    row[1] = dataUnit.value2.ToString();
    row[2] = dataUnit.value3.ToString();
    listOfRows.Add(row);
}
// StreamWriter creates the file itself; calling File.Create first leaves an open
// handle on the file and makes the StreamWriter constructor below throw
string filePath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Desktop), "Test200kFile.txt");
using (System.IO.StreamWriter file = new System.IO.StreamWriter(filePath))
{
    foreach (string[] array in listOfRows)
    {
        // WriteLine (not Write) so each row ends up on its own line in the file
        file.WriteLine(array[0] + "," + array[1] + "," + array[2]);
    }
}
using (MyFileDataReader reader = new MyFileDataReader(filePath))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy("my connection string to database"))
{
    bulkCopy.DestinationTableName = "my table name";
    bulkCopy.BatchSize = 20000; // what if it's less than 200,000?
    bulkCopy.WriteToServer(reader);
}
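For reference, I also put together a rough sketch of the approach I am considering without the temp file: building a DataTable in memory and handing it straight to SqlBulkCopy, since WriteToServer accepts a DataTable as well. The column names Column1/Column2/Column3 are just placeholders for whatever my real table uses:

using System.Data;
using System.Data.SqlClient;

// build the rows in a DataTable instead of writing them to a file first
DataTable table = new DataTable();
table.Columns.Add("Column1", typeof(string)); // placeholder column names
table.Columns.Add("Column2", typeof(string));
table.Columns.Add("Column3", typeof(string));

foreach (DataUnit dataUnit in dataUnitList)
{
    table.Rows.Add(dataUnit.value1.ToString(),
                   dataUnit.value2.ToString(),
                   dataUnit.value3.ToString());
}

using (SqlBulkCopy bulkCopy = new SqlBulkCopy("my connection string to database"))
{
    bulkCopy.DestinationTableName = "my table name";
    bulkCopy.BatchSize = 20000;
    bulkCopy.WriteToServer(table); // WriteToServer accepts a DataTable directly
}

Is something like that reasonable for 200,000 rows, or is writing the file out and feeding SqlBulkCopy a reader the better route?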