
I'm managing a SQL Server database via DACPAC releases in Azure DevOps. The project is currently in development, but I anticipate development continuing after the Prod go-live.

By default I have the Publish configuration set to block the deployment if data loss is about to occur, especially in Prod. The majority of changes shouldn't cause it, and my instinct is that data loss is more likely to be indicative of a bug than of intentional data abandonment.

But naturally, I expect that there will be *some* occasions on which a migration knowingly discards data, and that this is expected and OK.


I'd like to set up DevOps so that I can allow this in a controlled and robust way. What's the best way to achieve that?

My ideal would be something that essentially says: "yes, deploy this release ... yes, I know that it will cause data loss, and that's fine."

I have one idea, which I'll post as an answer. But I'm looking for other ideas, or any "standard" or "official" approaches. (Or just better ideas :D )

  • One thing I want to avoid is "I allowed data loss in this release, and then forgot to turn it off, so the next release was also inadvertently allowing data loss". – Brondahl Sep 25 '20 at 11:04

1 Answer


Azure DevOps allows you to pass parameters to SqlPackage.exe, one of which controls whether the publish is blocked when it would cause data loss:

/p:BlockOnPossibleDataLoss=false
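
For context, here's roughly where that switch sits in a full publish invocation. The file, server, and database names below are placeholders, not values from the question:

```
:: Publish a DACPAC, allowing operations that may discard data.
SqlPackage.exe /Action:Publish ^
    /SourceFile:"MyDatabase.dacpac" ^
    /TargetServerName:"my-server.database.windows.net" ^
    /TargetDatabaseName:"MyDatabase" ^
    /p:BlockOnPossibleDataLoss=false
```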

DevOps Release pipelines are themselves parameterisable via release "Variables", and those Variables can be edited within a particular release without affecting the default settings.

So you can parameterise the value passed into SqlPackage.exe and create a release which either does or does not allow data loss based on that variable. When you need to deploy something that does require data loss, create a release and set the Variable accordingly.
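
As a rough sketch of what that could look like in a YAML pipeline (a Classic Release pipeline would use a settable release Variable instead). The task is the stock `SqlAzureDacpacDeployment` task; the parameter name, service connection, server, database, and credential variables are placeholders I've made up for illustration:

```yaml
# Queue-time parameter: defaults to blocking, so a release that is allowed
# to discard data must be opted in explicitly, release by release.
parameters:
  - name: blockOnPossibleDataLoss
    displayName: 'Block deployment on possible data loss'
    type: string
    default: 'true'
    values:
      - 'true'
      - 'false'

steps:
  - task: SqlAzureDacpacDeployment@1
    inputs:
      azureSubscription: 'my-service-connection'    # placeholder service connection
      ServerName: 'my-server.database.windows.net'  # placeholder server
      DatabaseName: 'MyDatabase'                    # placeholder database
      SqlUsername: '$(SqlUser)'                     # placeholder credentials
      SqlPassword: '$(SqlPassword)'
      DacpacFile: '$(Pipeline.Workspace)/drop/MyDatabase.dacpac'
      # Pass the queue-time choice straight through to SqlPackage.exe:
      AdditionalArguments: '/p:BlockOnPossibleDataLoss=${{ parameters.blockOnPossibleDataLoss }}'
```

Because the parameter defaults to `'true'`, a release where nobody touches it still blocks on data loss, which also covers the "forgot to turn it off again" concern raised in the comments on the question.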


Edit: this approach worked fine in practice.

  • hmm, this is concerning: https://github.com/microsoft/azure-pipelines-tasks/issues/11191 – Brondahl Sep 25 '20 at 13:41
  • We deploy DACPACs using `/p:BlockOnPossibleDataLoss=false` together with `/p:IncludeTransactionalScripts=true` to roll back the change if it fails – Alexander Volok Sep 25 '20 at 14:29
  • Hmm, any update on this issue? It seems that you've found the solution yourself; you could [Accept it as an Answer](https://stackoverflow.blog/2009/01/06/accept-your-own-answers/). It can be beneficial to other community members reading this thread, and we can archive this thread for you. Thx~ – LoLance Oct 01 '20 at 07:02
  • BTW, I think your answer is a good idea :) It's recommended to define a **settable** variable `BlockOnPossibleDataLoss` with default value `true`, so that data loss is blocked by default. The command arguments: `/p:BlockOnPossibleDataLoss=$(BlockOnPossibleDataLoss)`. – LoLance Oct 01 '20 at 07:08
  • Then we can use a [runtime parameter](https://learn.microsoft.com/en-us/azure/devops/pipelines/process/runtime-parameters?view=azure-devops&tabs=script#use-parameters-in-pipelines) (for YAML pipelines) or [settable variables](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/?view=azure-devops#how-do-i-specify-variables-i-want-to-edit-when-a-release-is-created) (for Classic UI Release pipelines) to make the deployment conditional. – LoLance Oct 01 '20 at 07:10
  • You're not the only one who [loves settable variables](https://stackoverflow.com/a/61606886/10910450). As for `But I'm looking for other ideas, or any "standard" or "official" approaches`: as far as I know there's no so-called `best` or `much better` way, so feel free to use your approach since it works for your scenario. – LoLance Oct 01 '20 at 07:17
  • @LanceLi-MSFT yes, this approach worked just fine for me. – Brondahl Oct 01 '20 at 10:41