I'm importing an Excel file (only about 1000 records) into a dedicated SQL Server database. Since I need to work on the incoming data from Excel (adding a GUID per row, some data conversions), I want to do it row by row rather than bulk import (I have nothing against transactions, though).
I'm confused about how to do this correctly. I can either use a SqlCommand with parameters like so:
SqlCommand sqlCommand = new SqlCommand(
    "INSERT INTO TestTable (GUID, Name, Pricing) VALUES (@GUID, @Name, @Pricing)", sqlConn);

foreach (DataRow dr in ds.Tables[0].Rows) // <-- this is my Excel file to iterate through
{
    sqlCommand.Parameters.Clear();
    String refGUID = Guid.NewGuid().ToString(); // new GUID per imported row
    sqlCommand.Parameters.AddWithValue("@GUID", refGUID);
    sqlCommand.Parameters.AddWithValue("@Name", dr.ItemArray[0]);
    sqlCommand.Parameters.AddWithValue("@Pricing", dr.ItemArray[1]);
    sqlCommand.ExecuteNonQuery();
}
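Since I said I have nothing against transactions, this is how I imagine wrapping that loop in one (just a sketch; it assumes sqlConn is already open and that the column types match what I'm passing in):

using (SqlTransaction tx = sqlConn.BeginTransaction())
using (SqlCommand cmd = new SqlCommand(
    "INSERT INTO TestTable (GUID, Name, Pricing) VALUES (@GUID, @Name, @Pricing)", sqlConn, tx))
{
    foreach (DataRow dr in ds.Tables[0].Rows) // Excel rows, as before
    {
        cmd.Parameters.Clear();
        cmd.Parameters.AddWithValue("@GUID", Guid.NewGuid().ToString());
        cmd.Parameters.AddWithValue("@Name", dr.ItemArray[0]);
        cmd.Parameters.AddWithValue("@Pricing", dr.ItemArray[1]);
        cmd.ExecuteNonQuery();
    }
    // nothing becomes permanent until Commit; an exception before this point
    // means the transaction is rolled back when it is disposed
    tx.Commit();
}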
Or I can use the DataAdapter/DataSet ("disconnected") approach like so:
SqlDataAdapter dataAdapter = new SqlDataAdapter("SELECT GUID, Name, Pricing FROM TestTable", sqlConn);
SqlCommandBuilder commandBuilder = new SqlCommandBuilder(dataAdapter); // auto-generates the INSERT command
DataSet myDataSet = new DataSet();
dataAdapter.Fill(myDataSet, "TestTable");

foreach (DataRow dr in ds.Tables[0].Rows) // <-- this is my Excel file to iterate through
{
    DataRow row = myDataSet.Tables["TestTable"].NewRow();
    row["GUID"] = Guid.NewGuid().ToString(); // new GUID per imported row
    row["Name"] = dr.ItemArray[0];
    row["Pricing"] = dr.ItemArray[1];
    myDataSet.Tables["TestTable"].Rows.Add(row);
    dataAdapter.Update(myDataSet, "TestTable"); // per row, or should this go after the loop?
}
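For comparison, here is the variant where Update runs only once after the loop (a sketch reusing dataAdapter and myDataSet from above; there is also SqlDataAdapter.UpdateBatchSize for batching the generated INSERTs, but I haven't checked how it interacts with SqlCommandBuilder):

foreach (DataRow dr in ds.Tables[0].Rows) // Excel rows, as before
{
    DataRow row = myDataSet.Tables["TestTable"].NewRow();
    row["GUID"] = Guid.NewGuid().ToString();
    row["Name"] = dr.ItemArray[0];
    row["Pricing"] = dr.ItemArray[1];
    myDataSet.Tables["TestTable"].Rows.Add(row);
}

// up to here the new rows exist only in the in-memory DataSet;
// one Update call then sends an INSERT for every added row
dataAdapter.Update(myDataSet, "TestTable");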
Now my questions are the following:
- Is it better to send an INSERT command for each row (that would be the SqlCommand method), or is it better to fill a separate DataSet (the second method)? I guess it is rather "stupid" to fire thousands of individual inserts at the SQL server?
- Should dataAdapter.Update(myDataSet) be called once AFTER iterating through all Excel rows, or once per row (as shown in the example code above)? Would this magically create a transaction?
- Which method should I use? There is also LINQ to SQL; why not use that (performance maybe, because of another layer)? See the sketch after this list for what I imagine that would look like.
- What happens to the DataSet when some error occurs while reading the Excel file? Is the update still pushed to the SQL server, or is everything lost?
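To make the LINQ to SQL question concrete, this is roughly what I imagine it would look like (just a sketch: the TestRow mapping class, the column types and connectionString are my assumptions, not tested code):

using System.Data.Linq;
using System.Data.Linq.Mapping;

[Table(Name = "TestTable")]
public class TestRow
{
    [Column(IsPrimaryKey = true)] public string GUID;
    [Column] public string Name;
    [Column] public decimal Pricing; // assuming Pricing is a decimal column
}

// ... later, instead of the SqlCommand / DataAdapter code:
using (DataContext db = new DataContext(connectionString))
{
    Table<TestRow> table = db.GetTable<TestRow>();
    foreach (DataRow dr in ds.Tables[0].Rows) // Excel rows, as before
    {
        table.InsertOnSubmit(new TestRow
        {
            GUID = Guid.NewGuid().ToString(),
            Name = dr.ItemArray[0].ToString(),
            Pricing = Convert.ToDecimal(dr.ItemArray[1])
        });
    }
    db.SubmitChanges(); // sends all pending INSERTs; LINQ to SQL runs them inside a transaction
}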
In short: I want to import an Excel file into SQL Server row by row whilst making changes to the data on the way in. I don't want to use SSIS packages (besides the data conversion, I do a lot more with the Excel file, like importing it to SharePoint and firing off workflows) or BizTalk.
»» How can I do that beautifully?
In the end I went ahead and bought Aspose Cells. Aspose offers a very nice set of tools for this.
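For anyone curious, this is roughly how the Aspose.Cells part looks when feeding the parameterized INSERT from above; the file name, sheet layout and column order are of course specific to my file (row 0 is assumed to be a header row):

using Aspose.Cells;

Workbook workbook = new Workbook("pricing.xlsx");   // hypothetical file name
Worksheet sheet = workbook.Worksheets[0];

for (int r = 1; r <= sheet.Cells.MaxDataRow; r++)   // skip the header row
{
    sqlCommand.Parameters.Clear();
    sqlCommand.Parameters.AddWithValue("@GUID", Guid.NewGuid().ToString());
    sqlCommand.Parameters.AddWithValue("@Name", sheet.Cells[r, 0].StringValue);
    sqlCommand.Parameters.AddWithValue("@Pricing", sheet.Cells[r, 1].Value);
    sqlCommand.ExecuteNonQuery();
}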