Curious if this is feasible. I'm currently building a number of SQL Server Agent jobs. These jobs are just SFTP jobs that pass files between the servers of two different clients, making a pit stop at my local server for some pre- and post-processing. Yes, this setup is arguably overcomplicated, but it is necessary from a security standpoint. All of these jobs have identical structure:
- SFTP a file from the client1server to the local server.
- Run an executable on the file.
- SFTP the processed file to client2server.
- Wait a predetermined amount of time so that client2 can perform their query on the input.
- SFTP the response file from client2server to the local server.
- Run a second executable on the file.
- SFTP the processed response file back to client1server.
Pretty straightforward.
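For context, when I script one of these jobs by hand, each transfer ends up as a CmdExec job step shaped roughly like this (winscp.com, the user, and the paths are just placeholders for our actual SFTP client and files):

EXEC msdb.dbo.sp_add_jobstep
    @job_name          = N'client1_client2_interface',
    @step_name         = N'Fetch input from client1server',
    @subsystem         = N'CMDEXEC',
    @command           = N'winscp.com /command "open sftp://user@client1server/" "get /outbound/input.csv C:\interface\work\" "exit"',
    @on_success_action = 3;  -- go to the next step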
There are only a handful of values that change between each job:
- Input/output file path on client1server
- Input/output file path on client2server
- Directory on local server
These jobs are not complicated, so if necessary I can just create them all by hand. That seems like an unnecessary amount of work, though. I had the thought that maybe I could create a stored procedure that generates the SQL script that creates the job, and that stored procedure could simply accept the variables that change from job to job. Is this feasible?
Broadly, here's what I'm thinking:
CREATE PROCEDURE create_ftp_interface_job
    @client1input_fp  nvarchar(100),
    @client1output_fp nvarchar(100),
    @client2input_fp  nvarchar(100),
    @client2output_fp nvarchar(100)
    -- etc.
AS
    -- <SQL script for creating the SQL Server Agent job, with parameters inserted>
GO
I've tried an early version of this, and I'm having trouble referencing the variables declared in the stored procedure definition from inside the SQL Server Agent job script. I came here to ask whether what I'm attempting is feasible and I just have a run-of-the-mill reference error, or whether what I'm doing isn't allowed at all.
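In case it clarifies what I'm after, here's a trimmed-down sketch of the direction I've been going. Everything here is illustrative: I've added a @job_name parameter, collapsed the seven steps down to two, winscp.com again stands in for our real SFTP client, and the 30-minute delay is made up. The part I suspect I'm getting wrong is that @command is just an nvarchar parameter to sp_add_jobstep, so my procedure's variables have to be concatenated into that string rather than referenced inside the literal:

CREATE PROCEDURE create_ftp_interface_job
    @job_name        nvarchar(128),
    @client1input_fp nvarchar(100),
    @local_dir       nvarchar(100)
    -- etc.
AS
BEGIN
    -- Create the empty job shell.
    EXEC msdb.dbo.sp_add_job @job_name = @job_name;

    -- Build the step command by concatenating the parameters into the
    -- string; they can't be referenced from inside a string literal.
    DECLARE @cmd nvarchar(4000) =
        N'winscp.com /command "open sftp://user@client1server/" "get '
        + @client1input_fp + N' ' + @local_dir + N'" "exit"';

    EXEC msdb.dbo.sp_add_jobstep
        @job_name          = @job_name,
        @step_name         = N'Fetch input from client1server',
        @subsystem         = N'CMDEXEC',
        @command           = @cmd,
        @on_success_action = 3;  -- go to the next step

    -- The fixed wait can be a plain T-SQL step.
    EXEC msdb.dbo.sp_add_jobstep
        @job_name          = @job_name,
        @step_name         = N'Wait for client2 to process',
        @subsystem         = N'TSQL',
        @command           = N'WAITFOR DELAY ''00:30:00'';',
        @on_success_action = 1;  -- last step in this trimmed sketch

    -- ... remaining steps would follow the same pattern ...

    -- Attach the job to the local server so the Agent will run it.
    EXEC msdb.dbo.sp_add_jobserver @job_name = @job_name;
END
GO

The idea would then be one call per client pairing, something like:

EXEC create_ftp_interface_job
    @job_name        = N'client1_client2_interface',
    @client1input_fp = N'/outbound/input.csv',
    @local_dir       = N'C:\interface\work\';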