
I have some SSIS packages that are stored in SQL Server and exposed via a SQL Agent Job.

I have written a component that dynamically builds the correct command line for the job step containing the SSIS package, which allows me to override package variables using the /SET option.
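For context, the command-building part looks roughly like this - a minimal sketch, where the class, method and parameter names and the fixed /SQL and /SERVER arguments are illustrative rather than my actual component (the /SET property-path syntax is the standard dtexec one):

```csharp
using System.Collections.Generic;
using System.Text;

static class DtexecCommandBuilder
{
    // Builds a dtexec-style argument string for the job step's Command
    // property, overriding package variables via /SET. All names here
    // are illustrative; the real component reads overrides from the UI.
    public static string BuildCommand(string packagePath, string server,
                                      IDictionary<string, string> overrides)
    {
        var sb = new StringBuilder();
        sb.Append("/SQL \"").Append(packagePath).Append('"');
        sb.Append(" /SERVER ").Append(server);
        foreach (var kv in overrides)
        {
            // e.g. /SET "\Package.Variables[User::BatchId].Properties[Value]";"42"
            sb.Append(" /SET \"\\Package.Variables[User::").Append(kv.Key)
              .Append("].Properties[Value]\";\"").Append(kv.Value).Append('"');
        }
        return sb.ToString();
    }
}
```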

Unfortunately, in SMO, I need to do the following:

jobStep.Command = MyNewCommandString;  // overwrite the step's command line
jobStep.Alter();                       // persist the change to the Agent job definition
parentJob.Start();                     // note: Start() is asynchronous

The problem with this approach is that it requires me to call .Alter() for the new command line to take effect, which updates the JobStep definition on the SQL Server Agent. If I don't call Alter() and just call Start(), the new command line is not used.

This is obviously a PITA, because now I need to worry about 'restoring' the original command line. Is there any way I can pass a custom command string to the JobStep and have it execute without calling Alter(), so that I am not responsible for restoring the original command line?

BTW xp_cmdshell is not an option

blue18hutthutt
  • It's not clear from the question why this process is necessary. Might a simpler solution be to have the process write out a config file containing the package variable values so that the SQL Agent job can be called without modification? – Ed Harper Apr 16 '12 at 20:19
  • This process is to allow first-level production support the ability to start SSIS jobs from an Admin section of a web page and monitor the package execution status while having the ability to optionally override individual package variables (the equivalent of running dtexec.exe with /SET flags) – blue18hutthutt Apr 16 '12 at 20:27
  • Can't SMO pull the details of the current command value BEFORE you assign the new value? Then you can store it wherever before you apply the change. You can't commit a change in a Windows dialog without clicking OK or Apply - this is along the same lines. If you want to save the previous state, you need to retrieve it first. – Aaron Bertrand Apr 16 '12 at 20:29
  • Yes, you CAN retrieve the previous Command value before assigning the new one, but that makes it messy to ensure the original Command is restored: if I call .Start() and then immediately re-assign the original Command and call .Alter(), the job may not have started yet, because .Start() is asynchronous. I'm not comfortable with doing a Thread.Sleep before a clean-up step, because if something goes wrong in between I'll lose the original Command, and it's just a hack solution anyway – blue18hutthutt Apr 16 '12 at 20:44
  • That's why I suggested you STORE it somewhere, rather than relying on it persisting through your script (which isn't storing it). You could add a last step to the job which pulls the old command from wherever you store it and updates itself to its previous state. – Aaron Bertrand Apr 16 '12 at 21:01
  • If it comes down to that, that's what I might have to do - however my question is, is there a way to do what I'm trying to do with SMO without having to go to the trouble of doing this – blue18hutthutt Apr 16 '12 at 21:03
  • I don't think there is. You can't save your changes without issuing the `Alter()`. So, do you want to change the job and run it, or not? If you want the command to run to be synchronous, have you thought about just running it directly instead of worrying about this job at all? – Aaron Bertrand Apr 16 '12 at 21:08
  • I'd rather not have to change the job and run it, nor be responsible for restoring it. I can't run it directly because it's on another machine and xp_cmdshell isn't available and the SSIS packages themselves are stored inside SQL Server. To my knowledge sp_start_job doesn't have a mechanism to pass job parameters either. Seems like my only option is SMO or xp_cmdshell (not available due to DBA policy). The only other way would be to create a temp job with .DeleteLevel = CompletionAction.Always but then I'd lose the execution history for the job. – blue18hutthutt Apr 16 '12 at 21:11
  • Still think you can accomplish this by pulling the current command from a table of some kind or a config file as Ed suggested. (A table could keep history, and the job could just pick the latest one automatically.) But there isn't a magic way to change a job and then have it revert to its previous state without some other mechanism, sorry. – Aaron Bertrand Apr 16 '12 at 21:24
  • Could you go about doing this backwards? Instead of trying to alter a job in an attempt to troubleshoot a failing package, extract the relevant bits and spawn a clean process with the Agent or the proxy user's credentials that runs the package. That way you preserve your original job and have a clean sandbox to troubleshoot in. Or if not a dedicated sandbox app/thread/whatever, create a parallel job that's dedicated to futzing about. – billinkc Apr 17 '12 at 01:47
  • billinkc: not sure what you mean - I am not altering a job in an attempt to troubleshoot a failing package; I am trying to programmatically invoke an SSIS package with custom package variable values, equivalent to running dtexec.exe with /SET flags. I cannot spawn a new process because the Agent resides on another machine and xp_cmdshell is disabled. To be clear, I am not altering a job because I want to - I am forced to because this is how the SMO API behaves. – blue18hutthutt Apr 17 '12 at 06:24
  • Aaron, I am going to go your route and temporarily cache the original command line in a separate table, and do a check each time before kicking off a job to handle unexpected termination before a successful restore. It's not ideal but it seems to be the only way I can think of - unless there is any other way I can invoke an SSIS job remotely, programmatically? I think I've covered all the bases - xp_cmdshell, sp_start_job, SMO ...have I missed any? Thank you for your suggestion, it isn't ideal but it's better than creating a temp job that auto-deletes, because then I lose the job history. – blue18hutthutt Apr 17 '12 at 06:26
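For anyone who lands here later, this is roughly the shape of the cache-and-restore approach discussed above - a sketch only, with an in-memory interface standing in for both SMO and the backup table; every type and member name below is made up for illustration (the real SMO calls are JobStep.Command, JobStep.Alter() and Job.Start()):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical abstraction over SMO (GetCommand/SetCommand) and the
// backup table (TryGetBackup/SaveBackup/ClearBackup). Names invented.
interface IJobStepStore
{
    string GetCommand(string jobName, int stepId);
    void SetCommand(string jobName, int stepId, string cmd); // Alter() via SMO
    string TryGetBackup(string jobName, int stepId);         // null if none
    void SaveBackup(string jobName, int stepId, string cmd);
    void ClearBackup(string jobName, int stepId);
}

static class JobRunner
{
    // Before every run: if a backup exists, a previous run died before
    // restoring, so put the original command back first.
    public static void RestoreIfNeeded(IJobStepStore store, string job, int step)
    {
        var backup = store.TryGetBackup(job, step);
        if (backup != null)
        {
            store.SetCommand(job, step, backup);
            store.ClearBackup(job, step);
        }
    }

    public static void RunWithOverride(IJobStepStore store, string job, int step,
                                       string newCommand, Action startJob)
    {
        RestoreIfNeeded(store, job, step);
        store.SaveBackup(job, step, store.GetCommand(job, step)); // cache original
        store.SetCommand(job, step, newCommand);                  // Alter()
        startJob();                                               // Start(), async
        // The restore deliberately does NOT happen here.
    }
}
```

The key point is the restore-before-run check: because Start() is asynchronous, restoring immediately after Start() could clobber the command before the job picks it up, so the restore is deferred to the next invocation (or to a final job step that reads the backup table).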

0 Answers