
I have an insert method that works fine with smaller data sets, but once the data sets get beyond a certain size, the method times out, regardless of what I set in the initial transaction scope.

The insert methods in the code call simple SQL stored procedures via ExecuteNonQuery.

My code looks something like this (fluff removed):

    public void method()
    {
        using (TransactionScope testScope = new TransactionScope(TransactionScopeOption.Required, new System.TimeSpan(1, 25, 0)))
        {
            timeDB.insert(var.time);
            codeDB.insert(var.code);

            foreach (variable var in listOfVariables)
            {
                nameDB.insert(var.value);
            }

            testScope.Complete();
        }
    }

I assume the issue has something to do with the command timeout rather than the scope timeout, since the code works fine for the smaller data sets. If that's the case, is there any way to change the command timeout without modifying machine.config (or any other .config file)? I'm strictly banned from modifying them, as changing those values would break the rest of the program.

If the issue ISN'T the specific command timeouts, what is causing it? Like I said, the code works fine with anywhere between 8,000 and 15,000 inserts (depending on the Connect Timeout value set in our connection string in the program config file, which I can only modify for test purposes but cannot permanently change), but anything larger than that and it crashes.

Also, I'm sure it's definitely not the scope timeout value, as the task times out after a few minutes, nowhere near the 1 hour and 25 minutes set in the TransactionScope.

Thank you in advance for taking the time to help me fix this issue.

Edit:

By request, I've added an example of one of the insert methods that is eventually called by nameDB.insert (fluff removed; note that the actual insert method is a sub-method of nameDB.insert, and the actual loop is in that nameDB.insert method. I just used nameDB etc. to lessen the fluff).

    public Int32 Insert(Hashtable serData, DataDO data)
    {
        int numAffected = 0;

        IDataParameter[] parameters =
        {
            this.Helper.GetParameter("@Text", data.Text),
            this.Helper.GetParameter("@CreationDt", ((data.CreationDate == DateTime.MinValue)
                ? Convert.DBNull : data.CreationDate)),
            this.Helper.GetParameter("@TypeId", data.TypeId),
            this.Helper.GetParameter("@KeyId", DbType.Int32, 4, ParameterDirection.Output)
        };

        numAffected = this.Helper.ExecuteNonQuery(this.ConnectionString, "InsertData", parameters);

        // Return the generated key from the output parameter when the insert succeeded.
        if (numAffected > 0)
            return Convert.ToInt32(parameters[parameters.Length - 1].Value);

        return 0;
    }
A.D
  • As Oana said, you want the CommandTimeout property of the DbCommand object. Can you provide more info on what timeDB and codeDB are in your example? – The other other Alan May 08 '13 at 19:58
  • If your transaction is going to DTC, then note that DTC has a very short maximum timeout - for a very good reason: you should *not* be holding transactions open very long - especially "serializable" ones. – Marc Gravell May 08 '13 at 20:04
  • I've edited the post to add the insert method since it was too long for the comments. Thank you for your help. – A.D May 08 '13 at 20:04

2 Answers


I think you can also set the command timeout in code, on the DbCommand, if you are using a command. The DbCommand does not inherit a timeout from the transaction scope or the connection string, so you have to set it manually.
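As a minimal sketch, assuming you have direct access to a SqlCommand (the Helper class in the question may or may not expose CommandTimeout):

    using System.Data;
    using System.Data.SqlClient;

    // Sketch only: reuses the "InsertData" stored procedure name from the question;
    // add its parameters before executing.
    public static void InsertWithLongerTimeout(string connectionString)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand("InsertData", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.CommandTimeout = 300; // seconds; the default is 30 and is independent of Connect Timeout

            connection.Open();
            command.ExecuteNonQuery();
        }
    }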

Oana Marina

Transactions should not be running for a long time! This is by design in the underlying platforms such as SQL Server. The timeouts are like safety valves; opening them too much could bring down the system.

The reason the inserts take so much time is that you execute the query once per row. That means, for each row:

  1. The client sends the command to SQL Server
  2. SQL Server executes it
  3. SQL Server returns the result to the client

If the SQL connection is over the network, the delay per loop is even longer.

The recommended solution is to send the inserts in batches. There are many ways to do it; depending on how much time you can invest in refactoring, it can be anything from a simple dynamic SQL string builder, to SqlBulkCopy, to an Object Relational Mapping (ORM) tool such as Entity Framework (EF) or NHibernate that supports batch updates.
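If you cannot change the data access layer much, a minimal SqlBulkCopy sketch might look like this (assuming the rows can be staged in a DataTable; the destination table name here is a placeholder):

    using System.Data;
    using System.Data.SqlClient;

    // Sketch only: "NameValues" is a placeholder for your actual destination table,
    // and the DataTable's columns are assumed to line up with it
    // (add bulkCopy.ColumnMappings if they do not).
    public static void BulkInsert(string connectionString, DataTable rows)
    {
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "NameValues";
            bulkCopy.BulkCopyTimeout = 300;  // seconds; 0 means no limit
            bulkCopy.BatchSize = 5000;       // rows sent per round trip

            bulkCopy.WriteToServer(rows);    // one bulk operation instead of one command per row
        }
    }

This replaces thousands of individual ExecuteNonQuery round trips with a single streamed bulk operation, which is usually what brings the runtime back under the timeouts.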

Using an ORM, your code would change to something like this:

    public void method()
    {
        using (MyDatabase db = new MyDatabase())
        {
            timeDB.insert(var.time);
            codeDB.insert(var.code);

            foreach (variable var in listOfVariables)
            {
                nameDB.insert(var.value);
            }

            using (TransactionScope testScope = new TransactionScope())
            {
                db.SaveChanges();
                testScope.Complete();
            }
        }
    }
Michael
  • Hello Michael, thank you for your response. I tried your suggestions, but it still timed out. I will let you know if I find anything out. Thank you! – A.D May 16 '13 at 15:36
  • Create the SQL script, wrap the inserts in one big transaction, and run it directly in SQL Query Analyzer (or the corresponding tool for the database you are using). Does it still time out? If not, how long does it take to complete? – Michael May 18 '13 at 18:28