
In a query that joins 3 table variables with 2 regular tables, the execution plan still shows a 16% cost for the scan of each table variable, even in cases where those table variables are empty (no rows).

While searching, I found this rule:

If a table variable is joined with other tables in SQL Server, it may result in slow performance due to inefficient query plan selection because SQL Server does not support statistics or track number of rows in a table variable while compiling a query plan

from https://sqlperformance.com/2014/06/t-sql-queries/table-variable-perf-fix
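For context, the shape I'm describing looks roughly like this (a minimal sketch; `dbo.Orders`, `dbo.OrderLines` and the column names are made up):

```sql
-- Three table variables joined with two permanent tables (hypothetical names).
DECLARE @Filter1 TABLE (Id INT PRIMARY KEY);
DECLARE @Filter2 TABLE (Id INT PRIMARY KEY);
DECLARE @Filter3 TABLE (Id INT PRIMARY KEY);

-- ... conditional INSERTs that may leave some of the table variables empty ...

SELECT o.*
FROM dbo.Orders AS o
JOIN dbo.OrderLines AS ol ON ol.OrderId = o.Id
JOIN @Filter1 AS f1 ON f1.Id = o.CustomerId
JOIN @Filter2 AS f2 ON f2.Id = o.RegionId
JOIN @Filter3 AS f3 ON f3.Id = o.StatusId
-- At compile time the optimizer has no statistics for the table variables and
-- typically assumes a fixed low row count (1 row) regardless of their contents.
-- OPTION (RECOMPILE) is one commonly cited way to let it see the actual counts.
OPTION (RECOMPILE);
```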

So my question is: is it worth validating, before creating the table variable and inserting rows into it, whether there are any rows to insert in the first place, skipping that table variable entirely if there aren't, and then, in the query that joins the 3 table variables, leaving out the joins to the table variables that weren't created?
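Something like the sketch below (all object names are hypothetical). As far as I know the `DECLARE` itself can't be made conditional, since it's scoped to the whole batch, so what would actually be conditional is the population and which branch of the query runs:

```sql
-- Sketch of the pattern being asked about (all object names are hypothetical).
DECLARE @Filter1 TABLE (Id INT PRIMARY KEY);
DECLARE @HasFilter1 BIT = 0;

-- Only populate the table variable when there is actually something to insert.
IF EXISTS (SELECT 1 FROM dbo.SourceTable WHERE SomeCondition = 1)
BEGIN
    INSERT INTO @Filter1 (Id)
    SELECT Id FROM dbo.SourceTable WHERE SomeCondition = 1;

    SET @HasFilter1 = 1;
END;

IF @HasFilter1 = 1
BEGIN
    -- Variant of the query that joins to the populated table variable.
    SELECT o.*
    FROM dbo.Orders AS o
    JOIN @Filter1 AS f1 ON f1.Id = o.CustomerId;
END
ELSE
BEGIN
    -- Same query with the join to the empty table variable left out.
    SELECT o.*
    FROM dbo.Orders AS o;
END;
```

With 3 table variables this branching obviously multiplies, which is part of why I'm asking whether it's worth doing at all.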

  • This seems more like an "xy question". Rather than asking how to ignore the table variable if certain criteria are met, you would be questioning *"Why am I using table variables? Why am I using them instead of a temporary table, or even a static table? How can I change my query to not use them?"* You can't "ignore" a table in the `JOIN` syntax if it doesn't have any rows; the query analyser has to check. As you've seen in the article, table variables aren't a good replacement for an actual table, so I think we'd be better off trying to find a solution to using those. – Thom A Feb 19 '18 at 15:03
  • The table variable is being used instead of a temporary table because in this case the number of rows inserted is always 1 row max, and table variable performance is better than a temp table for such a small data set. – H.C Feb 19 '18 at 15:16
  • @H.C, I wouldn't put much weight on the estimated cost, and wouldn't try to solve a performance problem you don't have. SQL Server will estimate 1 row for table variables, which is close enough to zero as not to matter from a plan perspective. – Dan Guzman Feb 19 '18 at 15:22
  • 1
    Performance difference between a table variable and a temp table for a single row is so minimal it probably can't even be measured. If your execution plan is showing 16% joining to a single row table variable and performance is an issue you have much bigger issues going on. What you need to focus on is how to improve performance of the whole thing, not isolate little pieces here and there that you think might be the problem. – Sean Lange Feb 19 '18 at 15:22
  • @SeanLange . . . Perhaps the point is that the query is so fast already that scanning a table variable is 16% of the estimate. In any case, SQL Server assumes something like 1,000 rows when the size of the table is unknown. – Gordon Linoff Feb 19 '18 at 15:38
  • @GordonLinoff I think we are saying the same thing. 16% doesn't always mean it is slow if that 16% is a part of a millisecond. :) – Sean Lange Feb 19 '18 at 15:42

0 Answers