
We have a poorly-performing query in LINQ to SQL (L2S). The actual query looks something like this:

SELECT *
FROM Table
WHERE 
    ([ColA] IN (@p0) AND ColB = @p1) 
    OR ColB IN (@p2, @p3, @p4, @p5)
-- @p0: Input Int (Size = 0; Prec = 0; Scale = 0) [220]
-- @p1: Input VarChar (Size = 14; Prec = 0; Scale = 0) [CountryDefault]
-- @p2: Input NVarChar (Size = 0; Prec = 0; Scale = 0) []
-- @p3: Input NVarChar (Size = 0; Prec = 0; Scale = 0) []
-- @p4: Input NVarChar (Size = 7; Prec = 0; Scale = 0) [WF1 1XU]
-- @p5: Input NVarChar (Size = 3; Prec = 0; Scale = 0) [WF1]
-- Context: ProfiledSqlProvider(Sql2008) Model: AttributedMetaModel Build: 3.5.30729.1

One problem is a merge-sort as a result of the OR, but we're looking into that.

The other problem is more interesting. One of our DBAs pointed out there's quite a performance hit from the implicit codepage conversion between the NVarChar parameters and the VarChar column they're being compared with.

The weird thing is why L2S chose NVarChar for some of those parameters but VarChar for @p1. In the code they are all strings, although @p1 comes from a string literal while the others come from variables. Notice they're all being compared to the same column (ColB), so it's nothing to do with the column's datatype.
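
For what it's worth, the C# looks roughly like this (a minimal sketch; the entity and member names are invented for illustration, not our real model):

using System;
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

[Table(Name = "Table")]
public class Row
{
    [Column] public int ColA;
    [Column(DbType = "VarChar(50)")] public string ColB; // column itself is mapped as VarChar
}

class Demo
{
    static void Main()
    {
        using (var db = new DataContext("...connection string..."))
        {
            var ids = new[] { 220 };                       // -> [ColA] IN (@p0)
            string[] codes = { "", "", "WF1 1XU", "WF1" }; // local string variables -> @p2..@p5 (NVarChar)

            var results =
                from t in db.GetTable<Row>()
                where (ids.Contains(t.ColA) && t.ColB == "CountryDefault") // string literal -> @p1 (VarChar)
                   || codes.Contains(t.ColB)
                select t;

            foreach (var r in results)
                Console.WriteLine(r.ColB);
        }
    }
}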

How does L2S decide what datatype to use when generating the query, based on the values passed in? If possible, how do I control that?
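
If there's no way to influence the parameter types from the query itself, the fallback we're considering is handwritten SQL via DataContext.ExecuteQuery (again just a sketch, reusing the invented names from above; whether ExecuteQuery's own string parameters avoid the NVarChar typing is something we'd still have to verify with the profiler):

// Fallback: bypass the L2S query generator and supply the SQL ourselves.
var rows = db.ExecuteQuery<Row>(
    @"SELECT * FROM [Table]
      WHERE ([ColA] IN ({0}) AND ColB = {1})
         OR ColB IN ({2}, {3})",
    220, "CountryDefault", "WF1 1XU", "WF1");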
