
I'm reading an Excel sheet, applying my business logic, and trying to insert the records using LINQ to SQL.

My loop has more than 5,000 and fewer than 15,000 records to insert.

public List<tblLog> tblLogList = new List<tblLog>(); 

This method is called inside the loop:

public void SaveLog()
{
    tblLog tnlog = new tblLog();
    tnlog.id = Guid.NewGuid();
    tnlog.comp_id = Comp_id;
    tnlog.branch_id = Branch_id;
    tnlog.row_id = rowID;
    tnlog.his_id = his_id;
    
    //Add records to List
    tblLogList.Add(tnlog);

Earlier I tried this code, submitting one record at a time:

    //_trContext.tblLogs.InsertOnSubmit(tnlog);
    //_trContext.SubmitChanges();

Due to the performance hit, I changed InsertOnSubmit to InsertAllOnSubmit:

    if (tblLogList.Count >= 1000)
    {
         _trContext.tblLogs.InsertAllOnSubmit(tblLogList);
         _trContext.SubmitChanges();
         tblLogList.Clear();
    }
}
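
Note that with this pattern, any records still in tblLogList when the loop ends (the final partial batch of fewer than 1,000) are never submitted; they would need one last flush after the loop, e.g.:

    if (tblLogList.Count > 0)
    {
        _trContext.tblLogs.InsertAllOnSubmit(tblLogList);
        _trContext.SubmitChanges();
        tblLogList.Clear();
    }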

My questions are:

  1. What is the maximum number of records I can insert through InsertAllOnSubmit() in LINQ to SQL?

  2. With the above code I achieved batches of 1,000 records, but I suspect that when it runs against 10,000 or more records it might throw a timeout exception, since it's implemented in my **Windows service**.

    I'm really confused. What would be the best way to handle this?

Thanks in advance.

RajeshKdev

3 Answers


While the others correctly state that there are better solutions for this task, a direct answer to your question has not yet been given.

The answer is that technically the number is either limited by the amount of memory the context will consume by swallowing all these new objects, or by the maximum size of the database transaction log. The former is far more likely to be the bottleneck, but you can prevent reaching both limits by committing the records in chunks of several hundreds or thousands and using a new context (!) for each batch.

As you said, a practical limit can be the command time-out (if you don't want to make it eternal). This too can be circumvented by committing in batches and, thus, starting a new command for each batch.

The functional limit depends on things like whether or not the process is unattended (patience), and whether the data are expected to be available within a certain amount of time. If these requirements are tight you may have to resort to one of the solutions suggested in the other answers.
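
To illustrate, here is a minimal sketch of that batching pattern; TrDataContext, sourceRows, and BuildLog are placeholders (not from the question) for your own context type, input data, and row-to-entity mapping:

    const int batchSize = 1000;

    // Commit in chunks; a fresh context per batch keeps memory flat
    // and gives every SubmitChanges its own command (and time-out window).
    for (int offset = 0; offset < sourceRows.Count; offset += batchSize)
    {
        var batch = sourceRows
            .Skip(offset)
            .Take(batchSize)
            .Select(row => BuildLog(row))   // map one source row to one tblLog
            .ToList();

        using (var context = new TrDataContext())
        {
            context.tblLogs.InsertAllOnSubmit(batch);
            context.SubmitChanges();
        }
    }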

Gert Arnold

At that volume, I'd be using SqlBulkCopy, perhaps via a DataTable, or perhaps via FastMember's ObjectReader API, which creates an IDataReader from a list of typed objects, suitable for feeding to SqlBulkCopy.
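
A rough sketch of the DataTable route (requires System.Data and System.Data.SqlClient; connectionString and the non-id column types are assumptions, so match them to the real tblLog schema):

    // Build a DataTable whose columns mirror the destination table.
    var table = new DataTable();
    table.Columns.Add("id", typeof(Guid));
    table.Columns.Add("comp_id", typeof(Guid));    // assumed type
    table.Columns.Add("branch_id", typeof(Guid));  // assumed type
    table.Columns.Add("row_id", typeof(int));      // assumed type
    table.Columns.Add("his_id", typeof(Guid));     // assumed type

    foreach (var log in tblLogList)
        table.Rows.Add(log.id, log.comp_id, log.branch_id, log.row_id, log.his_id);

    using (var bulkCopy = new SqlBulkCopy(connectionString))
    {
        bulkCopy.DestinationTableName = "tblLog";
        bulkCopy.BatchSize = 1000;       // rows per round trip
        bulkCopy.BulkCopyTimeout = 0;    // disable the time-out for large loads
        bulkCopy.WriteToServer(table);
    }

    // Or, with FastMember, skip the DataTable entirely:
    // using (var reader = ObjectReader.Create(tblLogList,
    //     "id", "comp_id", "branch_id", "row_id", "his_id"))
    // {
    //     bulkCopy.WriteToServer(reader);
    // }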

Marc Gravell
  • Are there any reference links or tutorials for using SqlBulkCopy? Sorry, I'm not aware of it. – RajeshKdev Jun 20 '13 at 16:09
  • @RJK I typed SqlBulkCopy into Google, and the first link is (correctly) MSDN - which has plenty of explanation and documentation: http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx – Marc Gravell Jun 20 '13 at 16:14
  • @RJK likewise FastMember has an example of this on the front page: https://code.google.com/p/fast-member/ – Marc Gravell Jun 20 '13 at 16:15
  • +1 & thanks for the quick response and valuable suggestions. I got an idea. – RajeshKdev Jun 20 '13 at 16:20

You're using the wrong tool for the job. LINQ-to-SQL, and most other ORMs, are for OLTP. You're not doing OLTP here, you're doing a "bulk" insert ETL. Use the right tool, like SqlBulkCopy, or a dedicated framework.

jason