
We have a database project that uses a dacpac to deploy schema changes and also allows a pre-deployment and post-deployment script.

However, we frequently have to run one-off scripts, and security would prefer that developers not have write access in prod (we do not have a DBA role at this time). I'm trying to find a solution that works with Azure DevOps: store one-time-run scripts in git, run each script if it has not been run before, and not run it again the next time the pipeline runs. We'd like this done through DevOps so that the service principal, not the dev, has access to run the queries; anything flowing through the pipe has been through our peer review process, and we have a record of what was executed.

I'm looking for suggestions from anyone who has done this or is aware of any product which can do this.

Josh

2 Answers


Use Liquibase. Though I would make it part of my code base, you can also use it from the CLI and run your scripts with that tool.
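For instance, a CLI run against SQL Server might look roughly like this (a sketch, not the exact invocation: the flag spellings below are the Liquibase 4.x kebab-case ones, older releases use camelCase flags like --changeLogFile, and the connection values would come from pipeline secrets):

liquibase update \
  --changelog-file=changelog.xml \
  --url="jdbc:sqlserver://myserver:1433;databaseName=MyDb" \
  --username="$DB_USER" \
  --password="$DB_PASSWORD"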

Liquibase keeps track of which SQL files have been applied across deployments, so you can have multiple stages (say DIT, UAT, STAGING, PROD) and it will apply the remaining one-off SQL changes to each over time.

Generally, unless you really need support, I doubt you'd need the commercial version. The open-source version is more than sufficient for my needs, and I have a relatively complex system already.

The main reason I like Liquibase over other technologies is that it allows for SQL-based changesets, so the learning curve is a lot lower.

Two tips:

  1. Don't rely on the automatic computation of the logicalFilePath; set it explicitly, even if that means repeating yourself. This lets you refactor your scripts later, so instead of lumping everything into a single folder you can group them.

  2. Name your scripts with the date first, so you can leverage the natural sort order. Both tips are illustrated in the sketch below.
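For example, a formatted SQL changeset that follows both tips could look like this (file path, author, and table names are made up for illustration):

--liquibase formatted sql
--changeset jdoe:2024-01-15-backfill-customer-status logicalFilePath:one-off/2024-01-15-backfill-customer-status.sql
-- One-off data fix; Liquibase records it in its changelog table and will not run it again.
UPDATE dbo.Customer SET Status = 'Active' WHERE Status IS NULL;

Saved as one-off/2024-01-15-backfill-customer-status.sql, it sorts naturally by date, and the explicit logicalFilePath means you can move the file into another folder later without Liquibase treating it as a new changeset.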

Archimedes Trajano

I've faced a similar problem in the past:

Option 1

If you can afford to have an additional table in your database to keep track of what has been executed, your problem is easily solved; there is a tool that does exactly this: https://github.com/DbUp/DbUp
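To give a sense of the bookkeeping involved, this is roughly the journal table DbUp creates for you by default (it names it SchemaVersions; treat the definition below as a sketch, since the exact shape varies by DbUp version and database provider):

CREATE TABLE dbo.SchemaVersions (
    Id INT IDENTITY(1,1) NOT NULL CONSTRAINT PK_SchemaVersions PRIMARY KEY,
    ScriptName NVARCHAR(255) NOT NULL,
    Applied DATETIME NOT NULL
);

DbUp checks ScriptName against this table before running anything, which is exactly the "run once" behavior you're after.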

You would then have a new repository, let's call it OneOffSqlScriptsRepository, and your pipeline would consume it:

resources:
  repositories:
  - repository: OneOffSqlScriptsRepository
    endpoint: OneOffSqlScriptsEndpoint
    type: git
    name: MyProject/OneOffSqlScripts  # hypothetical project/repository path

You'd then create a pipeline that runs this DbUp application, consuming the scripts from the OneOffSqlScripts repository; DbUp takes care of executing each script only once (it's configurable). The database username/password can be stored safely in the Library combined with Azure Key Vault, so only people with the right access rights can read them (apart from the pipeline).
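As a rough sketch of those pipeline steps (the DbUpRunner console project is hypothetical, and $(DbConnectionString) is assumed to come from a Key Vault-backed variable group):

steps:
- checkout: self
- checkout: OneOffSqlScriptsRepository
  path: s/one-off-scripts                    # explicit path so the runner knows where to look
- task: DotNetCoreCLI@2
  displayName: Run one-off scripts with DbUp
  inputs:
    command: run
    projects: '**/DbUpRunner.csproj'         # hypothetical runner project in this repo
    arguments: -- "$(Pipeline.Workspace)/s/one-off-scripts"
  env:
    # Secret variables are not exposed to tasks automatically; map the
    # connection string explicitly for the runner to read.
    DB_CONNECTION_STRING: $(DbConnectionString)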

Option 2

This option assumes that you want to do everything using only the native resources that Azure Pipelines provides.

  1. Create a OneOffSqlScripts repository as in Option 1.
  2. Create a ScriptsRunner repository.
  3. In the ScriptsRunner repository, create a folder containing a .json file that records the name of each script and the number of times it has been run (or a boolean flag).

E.g. (use "hasRun": false in place of the "runs" counter if you prefer a boolean):

[{
  "id": 1,
  "scriptName": "myscript1.sql",
  "runs": 0
}]

Then write a Python script that reads the JSON file, runs whatever hasn't been run yet, and writes the file back with updated run counts; you'd then update the repository after each pipeline run. This means your pipeline performs a git commit/push after each run in which there were new scripts to execute. That's the algorithm; the implementation can be tuned.
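A minimal sketch of that runner in Python (the script-runs.json and one-off-scripts names are made up, and it assumes sqlcmd is available on the agent to actually execute the SQL; the git commit/push would be separate pipeline steps afterwards):

import json
import subprocess
from pathlib import Path

STATE_FILE = Path("script-runs.json")   # tracking file kept in the ScriptsRunner repo
SCRIPTS_DIR = Path("one-off-scripts")   # checkout of the OneOffSqlScripts repo

def main():
    # Load prior state, if any, and index it by script name.
    state = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else []
    by_name = {entry["scriptName"]: entry for entry in state}
    # Date-first names mean sorted() processes scripts in chronological order.
    for script in sorted(SCRIPTS_DIR.glob("*.sql")):
        entry = by_name.setdefault(script.name, {"scriptName": script.name, "runs": 0})
        if entry["runs"] == 0:
            # Connection details would come from pipeline secrets in practice.
            subprocess.run(["sqlcmd", "-i", str(script)], check=True)
            entry["runs"] += 1
    # Persist the updated counts for the pipeline to commit back.
    STATE_FILE.write_text(json.dumps(list(by_name.values()), indent=2))

if __name__ == "__main__":
    main()

Note that the state file is only written at the end, so a failure partway through would retry already-executed scripts on the next run; writing the state after each script would make it more robust.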

Bruno