
When using SqlDataAdapter and SqlCommandBuilder, the generated UPDATE statements for SQL Server are inefficient, as explained below.

Here is sample code to reproduce the issue. SQL Server:

Create database TestDB;
GO
USE [TestDB]
CREATE TABLE [dbo].[test](
    [i] [int] NOT NULL,
    [v] [varchar](50) NULL,
    [c] [char](10) NULL,
 CONSTRAINT [pk1] PRIMARY KEY CLUSTERED ([i] ASC) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
insert into dbo.Test (i,v,c) values (10,'A','B');
GO

C# console application demo code:

using System;
using System.Data.SqlClient;
using System.Data;

namespace CmdBuildTest
{
    class Program
    {
        static void Main(string[] args)
        {
            SqlConnection cn = new SqlConnection();
            cn.ConnectionString = @"Server=localhost\sql2016;Database=testDB;Trusted_Connection=True;";             
            SqlDataAdapter da = new SqlDataAdapter("Select * From dbo.test", cn);

            //da.FillSchema(ds, SchemaType.Mapped);         
            SqlCommandBuilder cb = new SqlCommandBuilder(da);
            cb.ConflictOption = ConflictOption.OverwriteChanges;
            //cb.RefreshSchema();
            DataSet ds = new DataSet();
            da.Fill(ds);
            ds.Tables[0].Rows[0]["v"] = DateTime.UtcNow.Millisecond.ToString();
            SqlCommand u = cb.GetUpdateCommand(true);
            Console.WriteLine("Update Command: " + u.CommandText);

            foreach (SqlParameter par in u.Parameters)
            {
                Console.WriteLine(" Name=" +  par.ParameterName +  "|Type=" + par.SqlDbType.ToString() + " |Size=" + par.Size.ToString());
            }           
            da.UpdateCommand = u;  //I am not sure if this is required, but I am careful.               
            //Execute Changes / Update Statement :
            da.Update(ds);
            Console.ReadLine();
            //Sample result in Profiler:
            /*
                exec sp_executesql N'UPDATE [dbo].[test] SET [v] = @v WHERE (([i] = @Original_i))',N'@v varchar(3),@Original_i int',@v = '603',@Original_i = 1
            */
        }

    }
}

The Console.WriteLine shows the following SQL UPDATE statement created:

UPDATE [dbo].[test] SET [i] = @i, [v] = @v, [c] = @c WHERE (([i] = @Original_i))

In SQL Profiler, the following query is captured:

exec sp_executesql N'UPDATE [dbo].[test] SET [v] = @v 
WHERE (([i] = @Original_i))',N'@v varchar(3),@Original_i int',@v='708',@Original_i=1

Now as you can see, the parameter @v is declared as VARCHAR(3), while in the original table dbo.test the column v is defined as VARCHAR(50).

The VARCHAR(3) is passed because the value 708 has 3 digits. If you passed a constant string 5 characters long, the parameter would be declared as VARCHAR(5). This behaviour is by design, as explained for the SqlParameter.Size Property:

"If not explicitly set, the size is inferred from the actual size of the specified parameter value."

I am looking for a way to prevent that. Because of this behaviour, every distinct length of a variable-length value passed as a parameter forces a separate execution plan, which leads to thousands of near-identical execution plans in SQL Server being compiled, consuming CPU time and occupying a lot of RAM in the plan cache without ever being re-used.

How can this behaviour be influenced without completely redesigning the application's core code? What I would like is to obtain not only the TYPES from the source table via the CommandBuilder but also the SIZES, but it seems impossible to get at this information.
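For string columns, the size information in question is exposed on DataColumn.MaxLength once the table's schema is known (for example after a FillSchema call). A minimal sketch, using a hand-built DataTable as a stand-in for the schema that FillSchema would load from dbo.test:

```csharp
using System;
using System.Data;

class MaxLengthDemo
{
    static void Main()
    {
        // Stand-in for the schema FillSchema would populate from dbo.test.
        var table = new DataTable("test");
        var col = table.Columns.Add("v", typeof(string));
        col.MaxLength = 50;  // corresponds to varchar(50)

        // The defined size is available per column, independent of row values.
        Console.WriteLine(table.Columns["v"].MaxLength);
    }
}
```

Note that DataColumn.MaxLength is -1 for columns without a length limit and for non-string columns, so a guard is advisable before assigning it to SqlParameter.Size.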

Magier
  • Using the `FillSchema` method of the SqlDataAdapter will provide you with the column sizes, but in my testing this doesn't help. Even when mapping those `MaxLength` values back to the update command's parameters, the query plan still shows that the length for the `varchar` column is inferred. – wablab Feb 14 '17 at 18:45
  • Yes, that is exactly my finding too. I went down this path in the meantime as well, but the DataAdapter's Update method "destroys" everything that has been prepared in advance... – Magier Feb 14 '17 at 18:54

1 Answer


I think I found the answer, so I'll post it here (the extra space for formatted code will help). The answer appears to be a combination of FillSchema and handling the DataAdapter's RowUpdating event. So, as we briefly discussed in the comments above, use da.FillSchema(ds, SchemaType.Source); to obtain the column sizes. Then, add a handler for the DataAdapter's RowUpdating event, and set the column sizes on the update command's parameters there. Something like this:

EDIT: Including a more complete code example for reference:

// Requires: using System; using System.Data; using System.Data.SqlClient;
// and using System.Linq; (for Cast<SqlParameter>() below)
public static void Main(string[] args)
{
    var cn = new SqlConnection("Data Source=.; Initial Catalog=TestDB; Integrated Security=SSPI");
    var da = new SqlDataAdapter("SELECT * FROM dbo.test", cn);
    var cb = new SqlCommandBuilder(da);
    cb.ConflictOption = System.Data.ConflictOption.OverwriteChanges;

    var ds = new DataSet();
    da.Fill(ds);
    da.FillSchema(ds, SchemaType.Source);

    ds.Tables[0].Rows[0]["v"] = DateTime.UtcNow.Millisecond.ToString();
    da.RowUpdating += new SqlRowUpdatingEventHandler(da_RowUpdating);
    da.Update(ds);
}

static void da_RowUpdating(object sender, SqlRowUpdatingEventArgs e)
{
    foreach (var p in e.Command.Parameters.Cast<SqlParameter>())
    {
        // Note: MaxLength is -1 for non-string columns (e.g. int),
        // so only string columns receive a meaningful size here.
        p.Size = e.Row.Table.Columns[p.SourceColumn].MaxLength;
    }
    }
}

Here's a screenshot of what I see in Profiler: [screenshot]

wablab
  • I will try this on Thursday and give you feedback. Thank you. – Magier Feb 14 '17 at 21:48
  • I tried your suggestion. It basically does exactly the same as what I had already tried, using par.Size = ds.Tables[0].Columns[par.SourceColumn].MaxLength; in the main method. That, as well as your solution, successfully adjusts all the parameter sizes. But unfortunately the DataAdapter's Update method still changes all of that, and it ends in the undesired result of flexible, content-dependent data type sizes in the final sp_executesql command executed on SQL Server. So, unfortunately, it does not solve the problem. – Magier Feb 16 '17 at 11:47
  • Strange. It does appear to be working on my end. I've updated the code in my answer above to provide a more complete reference of what I'm doing. Maybe you can spot something that we're doing differently, or perhaps someone else can try to repro the results. – wablab Feb 16 '17 at 14:40
  • Yes, the main difference between yours and mine is that I forgot the following line: da.RowUpdating += new SqlRowUpdatingEventHandler(OnRowUpdating); I did not notice because the breakpoints inside this event handler still stopped/worked, but in the end the type sizes differ depending on that line. Without it, the parameter is varchar(length of value); with it, it is varchar(max length of column). Weird, I don't understand why, but after all your answer is correct! – Magier Feb 17 '17 at 09:49