
I had originally built a set of SQL files where the main SQL file calls another SQL file by passing variables and using xp_cmdshell to run a PowerShell command for the Invoke-Sqlcmd cmdlet. I got everything working nice and neat (here's the link), but then tried to take it "up a notch" by starting the process with a PowerShell script that calls the main SQL file with a set of variables, which in turn calls the relevant "support" SQL file with its own set of variables. (PS => SQL => SQL)
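To give a rough idea of the shape of it, a minimal sketch of the first hop; the server, file, and variable names here are made up (the actual scripts are more involved):

```powershell
# Hypothetical names throughout; this only illustrates the PS => SQL hop.
# Invoke-Sqlcmd substitutes the $(CustomerId) and $(Mode) tokens inside
# main.sql before it runs; main.sql then uses xp_cmdshell for the next hop.
$vars = @(
    "CustomerId=42",
    "Mode=summary"
)
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'MyDb' `
    -InputFile '.\main.sql' -Variable $vars |
    Select-Object * -ExcludeProperty RowError, RowState, Table, ItemArray, HasErrors |
    Export-Csv -Path '.\output.csv' -NoTypeInformation
```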

All worked fine on the dev machine, but not on the customer's server, because a proxy account is needed to run PS that calls xp_cmdshell. Not a big issue, but it definitely starts adding more "baggage" to the deployment. And of course, I would like to move completely away from xp_cmdshell, which is why I was trying to call the system of SQL files from PS.

There is quite a bit of query logic in the main SQL file, so converting from SQL to PS is quite daunting, but a possibility. I also tried using `sp_executesql` to call the "support" SQL files, but BulkColumn via OPENROWSET only returns the raw text of the file, and the parameters are not actually processed within the SQL file the way they are when it is called via `Invoke-Sqlcmd` with variables (see: Use SQL OpenRowSet BulkColumn to Insert data from .txt File).
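To show why that fails, here is a stripped-down version of the attempt (path and token name are hypothetical): BulkColumn hands back the file contents verbatim, so any sqlcmd-style $(...) tokens are still sitting in the text when sp_executesql runs it:

```sql
-- Hypothetical path and token name; demonstrates the dead end only.
DECLARE @sql NVARCHAR(MAX);

-- BulkColumn returns the raw file contents as a single value.
SELECT @sql = BulkColumn
FROM OPENROWSET(BULK 'C:\Scripts\support.sql', SINGLE_CLOB) AS src;

-- @sql still contains the literal text "$(Mode)"; nothing has substituted
-- it, so this errors out (or runs the token as plain text).
EXEC sp_executesql @sql;
```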

If I cannot find another way to do this, I could save the "support" SQL files as stored procedures to avoid the extra SQL file calls from the "main" SQL, which should work, but I was hoping not to introduce additional objects into a customer's on-premises database.
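If it comes to that, the conversion itself is fairly mechanical; a sketch with hypothetical names, standing in for the ELSE IF logic in one of the "support" files (CREATE OR ALTER needs SQL Server 2016 SP1 or later):

```sql
-- Hypothetical procedure and parameter names; the branches stand in for
-- the ELSE IF logic currently living in the "support" SQL file.
CREATE OR ALTER PROCEDURE dbo.usp_SupportQuery
    @Mode       VARCHAR(20),
    @CustomerId INT
AS
BEGIN
    SET NOCOUNT ON;

    IF @Mode = 'summary'
        SELECT @CustomerId AS CustomerId, 'summary row' AS Result;
    ELSE IF @Mode = 'detail'
        SELECT @CustomerId AS CustomerId, 'detail row' AS Result;
END;
GO

-- The "main" script (or PowerShell) then calls it with real parameters:
EXEC dbo.usp_SupportQuery @Mode = 'summary', @CustomerId = 42;
```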

Any ideas on this, or has the "mad scientist" approach gotten a little too mad...

Tom
  • Can you provide some very basic code samples of what you are attempting? – paneerakbari Jun 23 '22 at 19:12
  • T-SQL is a terrible language to do general programming in, or even just batch scripting. Repent of this madness before it consumes you and any of your future maintainers in a pit of eternal despair! ...well maybe that's a bit too dramatic but basically, whenever you find yourself using `xp_cmdshell` you should *seriously* reconsider using something like scheduled tasks (on the OS end). Also, `Invoke-Sqlcmd` is a poor re-implementation of `sqlcmd` and lacks support for properly parameterized statements -- `sqlcmd` variables *suck*. Consider using `SqlConnection`/`SqlCommand` directly. – Jeroen Mostert Jun 23 '22 at 19:15
  • All the stuff you're trying to do looks like it could be implemented proficiently through a handful of stored procedures (rather than individual script files) that are called "sanely" -- either from a job or a scheduled task-driven script. I'll also just [leave this here](https://stackoverflow.com/a/50582924/4137916) in case it's of use. – Jeroen Mostert Jun 23 '22 at 19:18
  • *If* you are using `sqlcmd` then it does support sqlcmd commands; so you *could* use those to run the contents of the second SQL file within the first one. I don't *like* the idea, though. See [sqlcmd Commands](https://learn.microsoft.com/en-us/sql/tools/sqlcmd-utility?view=sql-server-ver16#sqlcmd-commands) (specifically the `:r` command; a rough sketch follows these comments). – Thom A Jun 23 '22 at 19:19
  • The PS file uses 'Invoke-Sqlcmd' to get the execution output from the "main" SQL file, which in turn handles processing of the "support" SQL file, which dynamically builds the query and outputs the result. And yes, the code would be helpful. The "support" SQL files use ELSE IF logic to determine which dynamically generated (parameterized) query runs based on the request from the "main" SQL file. The "main" SQL file determines the overall parameters of the data requested (and passed to the "support" SQL file) based on input parameters from the PS file. – Tom Jun 23 '22 at 20:21
  • Just to recap on technologies in each file: PS (Invoke-Sqlcmd) => SQL (xp_cmdshell) => SQL (Exec) – Tom Jun 23 '22 at 20:23
  • The main reason I am using 'Invoke-Sqlcmd' via PowerShell is to get the nicest and cleanest formatting of CSVs with file headers. None of the other technologies handled this as well (BCP, Sqlcmd), although I have not tried using SqlConnection/SqlCmd directly. – Tom Jun 23 '22 at 20:31
  • Try DBA-Tools in Powershell: you can do something like `Invoke-DbaQuery ... | Export-Csv ....`. I concur that T-SQL is absolutely the wrong place to do most of what you are trying to do. Without further details it's hard to advise – Charlieface Jun 24 '22 at 01:49
  • Agreed on DBA-Tools, but unfortunately it would need to be installed. I am thinking of pushing the "support" SQL file (and maybe even the "main") to a stored procedure, which then would limit the extra SQL file call. I was just hoping to keep this as a portable file-based solution (copy the files, set the config, and run). – Tom Jun 24 '22 at 22:31
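Regarding the `:r` suggestion from Thom A above, a rough sketch of what that would look like, with hypothetical file and variable names; this relies on the outer file being run by `sqlcmd` itself, since `:r` is resolved by the tool, not by SQL Server:

```sql
-- main.sql, run as (hypothetical names):
--   sqlcmd -S MyServer -d MyDb -i main.sql -v Mode="summary"
-- :r splices support.sql into the script at parse time, so the included
-- file sees the same sqlcmd variables, e.g. $(Mode).
PRINT 'Running in mode: $(Mode)';
:r C:\Scripts\support.sql
```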

1 Answer


Call both SQL files from PowerShell, and pass the data in connection-specific temporary tables instead of variables: tables with names starting with `#`. Create them in the usual way, with `CREATE TABLE`. They live as long as the current connection lives, so there is no need to destroy them explicitly, although it won't hurt. When creating them, though, do a "drop if exists" first, just in case connection pooling hands you a connection that already has the temp table in it.
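A minimal sketch of that hand-off, with hypothetical file names. The key point is that both batches have to run on the same connection for the `#` table to survive, which is easiest with a single SqlConnection (note that SqlCommand does not understand the GO separator, so each file must be a single batch):

```powershell
# Hypothetical file names. support.sql should start with
# DROP TABLE IF EXISTS #Params, then create and fill the temp table;
# main.sql reads #Params and returns the final result set.
$connStr = 'Server=MyServer;Database=MyDb;Integrated Security=True'
$conn = New-Object System.Data.SqlClient.SqlConnection $connStr
$conn.Open()
try {
    # First file: creates and populates the connection-scoped temp table.
    $setup = $conn.CreateCommand()
    $setup.CommandText = Get-Content '.\support.sql' -Raw
    [void]$setup.ExecuteNonQuery()

    # Second file: same connection, so #Params is still visible.
    $query = $conn.CreateCommand()
    $query.CommandText = Get-Content '.\main.sql' -Raw
    $result = New-Object System.Data.DataTable
    $result.Load($query.ExecuteReader())

    $result |
        Select-Object * -ExcludeProperty RowError, RowState, Table, ItemArray, HasErrors |
        Export-Csv -Path '.\output.csv' -NoTypeInformation
}
finally {
    $conn.Close()
}
```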

This way, there is no need to execute SQL files from SQL. That's not a well-supported scenario to begin with.

Alternatively, concatenate both SQL files within PowerShell into a third one, and execute that.
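For instance (file names hypothetical): stitched together like this, Invoke-Sqlcmd runs everything in one session, so a GO between the parts is fine and any `#` temp tables created by the first part remain visible to the second:

```powershell
# Hypothetical file names; #temp tables survive across GO within a session.
$combined = Join-Path $env:TEMP 'combined.sql'
((Get-Content '.\support.sql' -Raw), 'GO', (Get-Content '.\main.sql' -Raw)) -join "`r`n" |
    Set-Content -Path $combined

Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'MyDb' -InputFile $combined |
    Select-Object * -ExcludeProperty RowError, RowState, Table, ItemArray, HasErrors |
    Export-Csv -Path '.\output.csv' -NoTypeInformation
```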

Seva Alekseyev