
Given:

CREATE PROCEDURE [dbo].[my_storedproc]
  @param1 int, @param2 varchar(100)
AS 
<<whatever>>
GO

Are there any known performance differences between these execution methods?

-- Method #1:
declare @param1 int = 1
declare @param2 varchar(100) = 'hello'  
exec my_storedproc @param1, @param2

-- Method #2:  
exec my_storedproc @param1=1, @param2='hello'

-- Method #3:  
declare @param1 int = 1
declare @param2 varchar(100) = 'hello'  
declare @procname nvarchar(100) = N'my_storedproc @param1, @param2'
declare @params nvarchar(4000) = N'@param1 int, @param2 varchar(100)'  
exec sp_executesql @procname, @params, @param1, @param2

-- Method #4:  
declare @procname nvarchar(4000) = N'my_storedproc @param1=1, @param2=''hello'''
exec sp_executesql @procname

-- Method #5:  
declare @procname nvarchar(4000) = N'my_storedproc 1, ''hello'''
exec sp_executesql @procname

-- Method #6:  
declare @procname nvarchar(4000) = N'my_storedproc 1, ''hello'''
exec (@procname)

"Why do you ask?" you ask? I am trying to find a way to generically execute stored procedures entirely based upon metadata, the controlling stored procedure that will physically execute all the other configured (in metadata) stored procedures knows nothing about them other than what is defined in the metadata. Within this controller SP, I cannot (in any practical sense) know and declare the specific physical parameters (with their required data types) required for every possible stored proc that might have to be called - I am trying to find a way to execute them entirely generically, while still hopefully maintaining decent performance (reusing query plans, etc).

– tbone
  • I don't know why so many people think this is a good idea. What you are describing is a single stored procedure that can execute any other stored procedure. This is like creating a single method in .NET that can do anything. Sure it can be done, but the cost is going to be performance. Every time you need to do anything with a stored procedure, it will first have to parse through a bunch of stuff to figure out what to actually do. – Sean Lange Mar 05 '15 at 19:44
  • 1
    It's interesting how SQL people literally can't conceive of the underlying principle here - not only is this not crazy, it is the very principle that ORM tools are based upon. "This is like creating a single method in .NET that can do anything" No it isn't, you still have individual stored procs. " the cost is going to be performance." Is it? That's the question. That's what detractors said about dynamic SQL with ORM's, and they were proven to be incorrect. – tbone Mar 05 '15 at 19:54
  • 3
    Right but an ORM generates the sql dynamically. And I have not seen an ORM that produces great sql yet. It is incredibly complicated which is why the sql is always so unmanageable. I think the sql coming out of something with this additional layer of abstraction would be even more crazy. I would be very leery of using method 3 or 4 that you described because it would be difficult to prevent sql injection without using parameters. And method 1 isn't going to be generic enough for what you are trying to do. – Sean Lange Mar 05 '15 at 20:02
  • As far as I can tell, 1 and 2 are identical in that only the outer query must be compiled (assuming the procedure has a cached plan); 3, 4 and 5 are also identical in that you must compile two lots of SQL, the outer and inner queries (for want of a better term), probably not a significant detriment in individual queries, but you may get a bloated plan cache. In addition, 3 and 4 look very difficult to make type safe. The 6th method will fail, as you are trying to execute a stored procedure called `[my_storedproc 1, ''hello'']`. – GarethD Mar 05 '15 at 20:07
  • If 6 should be `exec (N'my_storedproc 1, ''hello''')` then it is just the same as 5. How do you intend on calling the generic stored procedure that calls all the other procedures? Why are you trying to reinvent the wheel; if you want ORM-like behaviour then why not just use an ORM? – GarethD Mar 05 '15 at 20:10
  • Method 4 is BY FAR the easiest for me to implement, but the concerns are performance and SQL injection. This is an entirely internal system though, appropriate permissions will be set, and I will be doing some rudimentary checking for injection, so I'm not overly concerned. I'm somewhat more worried about performance with heavier queries. – tbone Mar 05 '15 at 20:57
  • @GarethD This will be called on demand (proxy data server scenario) or by a scheduled process (off hours data extraction scenario) - it's not an ORM, it is just based on the same principles. – tbone Mar 05 '15 at 21:01
  • Most people use SSIS for nightly SQL jobs. Is that not an option here? Will the sprocs you call really vary so much day to day that you can't set something like this up? – Jeremy Mar 11 '15 at 18:53
  • 1
    I suggest if you are going to do something that has the potential to affect the entire system performance that you do some in depth research first. You should not attempt to know this until you have read a book on the SQL server internals, and one on SQL server performance tuning and one on how to read execution plans. You need to understand how it creates the execution plan and the implications of that in the way you are intending to do business. It make a difference which SQL Server backend you are using too as they have changed how these things are interpreted from version to version. – HLGEM Mar 11 '15 at 19:52
  • To get you started: http://www.amazon.com/Microsoft-Server-Internals-Developer-Reference/dp/0735658560/ref=sr_1_1?s=books&ie=UTF8&qid=1426103603&sr=1-1&keywords=sql+server+internals – HLGEM Mar 11 '15 at 19:54
  • There are different versions of the book above for different versions of SQL Server, but always look for the one written by Kalen Delaney. – HLGEM Mar 11 '15 at 19:55
  • @Jeremy The design of this will be such that relatively low skilled admin staff can register new data sources, avoiding the requirement for sophisticated skills like an SSIS professional. – tbone Mar 12 '15 at 00:29
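On the point about Method #6 raised above: `exec @var` (no parentheses) treats the variable's value as the name of a procedure to call, while `exec (@var)` executes the value as a batch of dynamic SQL. A minimal illustration:

declare @procname nvarchar(4000) = N'my_storedproc 1, ''hello'''
exec @procname    -- fails: looks for a proc literally named [my_storedproc 1, 'hello']
exec (@procname)  -- works: runs the string as dynamic SQL, equivalent to Method #5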

1 Answer


There really shouldn't be a performance difference between the 6 options since they all end up executing the stored procedure rather than executing SQL statements directly.

However, there is no better indication of performance than testing this on your own system. You already have the 6 test cases so it shouldn't be hard to try each one.
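One way to check plan reuse while testing (this assumes you have `VIEW SERVER STATE` permission; the `LIKE` filter is just whatever text appears in your batch):

-- after running each method a few times, see how many cached plans exist
-- for the proc and how often each one was reused (usecounts)
SELECT cp.usecounts, cp.objtype, st.[text]
FROM   sys.dm_exec_cached_plans cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) st
WHERE  st.[text] LIKE N'%my_storedproc%'
AND    st.[text] NOT LIKE N'%dm_exec_cached_plans%';  -- exclude this query itself

If `usecounts` climbs on a single plan, you are getting reuse; if new rows keep appearing, you are compiling fresh plans.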

Within this controller SP, I cannot (in any practical sense) know and declare the specific physical parameters (with their required data types) required for every possible stored proc that might have to be called

Why not? I don't see why you couldn't dynamically generate the SQL for Methods 2 and 3 based on the output of either of the following queries:

-- catalog view: one row per parameter of the proc, with data type info
SELECT OBJECT_NAME(sp.[object_id]), *
FROM   sys.parameters sp
WHERE  sp.[object_id] = OBJECT_ID(N'dbo.my_storedproc');

-- ANSI-standard alternative with essentially the same information
SELECT isp.*
FROM   INFORMATION_SCHEMA.PARAMETERS isp
WHERE  isp.[SPECIFIC_NAME] = N'my_storedproc'
AND    isp.[SPECIFIC_SCHEMA] = N'dbo';

And with that info, you could create a table to contain the parameter values for each parameter of each proc. In fact, you could even set it up so that some parameters have "global" values shared across all procs, while others have values specific to a particular proc.
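For example, here is a rough sketch of generating a Method #2-style call from `sys.parameters` plus such a table of values (the `dbo.ProcParamValues` table is hypothetical, and values are assumed to be stored pre-formatted as SQL literals, e.g. `N'1'` for an int and `N'''hello'''` for a string):

DECLARE @sql nvarchar(max) = N'';

-- one name=value pair per parameter defined on the proc
SELECT @sql += N', ' + sp.[name] + N'=' + ppv.ParamValue
FROM   sys.parameters sp
INNER JOIN dbo.ProcParamValues ppv  -- hypothetical: (ProcName, ParamName, ParamValue)
        ON  ppv.ParamName = sp.[name]
        AND ppv.ProcName  = N'dbo.my_storedproc'
WHERE  sp.[object_id] = OBJECT_ID(N'dbo.my_storedproc');

-- swap the leading ', ' for the EXEC prefix (ISNULL covers a proc with no params)
SET @sql = N'EXEC dbo.my_storedproc ' + ISNULL(STUFF(@sql, 1, 2, N''), N'');
EXEC sp_executesql @sql;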

– Solomon Rutzky
  • All the SQL is stored as metadata; it is a mixture of plain SQL and stored procedure calls, and some of it references remote SQL servers. – tbone Mar 12 '15 at 00:26
  • @tbone The question text and examples only refer to stored procs being called by this setup. If the setup will also be calling dynamic SQL then there can be performance differences between some of those 6 options. And then of course even if there are parameters, those won't be discoverable via the two queries I listed. – Solomon Rutzky Mar 15 '15 at 14:31
  • I think my concerns behind this problem (performance of dynamic SQL including parameters via sp_executesql) might be eliminated by moving this to SQLCLR. You make a good point with your two query examples, but I also have to consider non-SP-based query text. With properly formed queries via SQLCLR, though, I would now expect the performance characteristics of what I'm doing to be essentially the same as any standard ORM (which most DBAs seem to hate, though I don't think there's much justification for that provided they're used correctly). – tbone Mar 16 '15 at 15:57
  • 1
    @tbone That is not how this stuff actually works. An execution plan is tied to the full text of the query (using a binary comparison: _everything_-sensitive). Non-parameterized SQL (`EXEC`, and `SqlCommand` with no parameters) will generate a new plan for any change in any parameter value since that changes the query text. Parameterized SQL (`sp_executesql` and `SqlCommand` with params) can reuse a plan since changing param values doesn't change the query text. SQLCLR doesn't change this. ORM's usually use `sp_executesql` but don't allow for fine-tuning the queries, hence why they're disliked. – Solomon Rutzky Mar 17 '15 at 19:26
  • I think as long as I render my dynamic SQL as *parameterized queries*, I should get plan reuse in many cases (at least from what I read that is the case). BUT, I think there are some quite strict guidelines you have to follow. I think in my case this is premature optimization, but I'd like to have a handle on it. – tbone Mar 18 '15 at 20:38
  • @tbone Yes, what I said in my previous comment was: "Parameterized SQL (`sp_executesql` with params and `SqlCommand` with params via `SqlParameter`) can reuse a plan since changing param values doesn't change the query text". I am not sure what other guidelines you think need to be in place. But there is no difference between using `sp_executesql` with params, using an ORM (so long as it uses `sp_executesql` with params), or using SQLCLR (which will be a `SqlCommand` with `SqlParameter`s); these are all parameterized queries. – Solomon Rutzky Mar 20 '15 at 02:13
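To make the plan-reuse point above concrete (the `dbo.SomeTable` name is hypothetical): the two `EXEC()` calls below compile two separate plans because the literal value is part of the query text, while the two `sp_executesql` calls share a single plan because the text never changes:

-- two different query texts => two cached plans
EXEC (N'SELECT * FROM dbo.SomeTable WHERE id = 1;');
EXEC (N'SELECT * FROM dbo.SomeTable WHERE id = 2;');

-- one query text, two executions => one cached plan with usecounts = 2
EXEC sp_executesql N'SELECT * FROM dbo.SomeTable WHERE id = @id;', N'@id int', @id = 1;
EXEC sp_executesql N'SELECT * FROM dbo.SomeTable WHERE id = @id;', N'@id int', @id = 2;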