5

So we are working on adding SQL DACPAC deployment as part of our continuous integration. We do this with a PowerShell script that runs "sqlpackage.exe" for each DACPAC file. We have about 12 DBs, so that's about 12 DACPACs.
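To give an idea, here is a simplified sketch of the kind of loop we run; the server name, paths, and the assumption that each .dacpac file is named after its target database are placeholders, not our real setup:

```powershell
$dacpacs = Get-ChildItem -Path 'C:\build\dacpacs' -Filter '*.dacpac'

foreach ($dacpac in $dacpacs) {
    # Each publish pays the full initialization cost before any changes are applied.
    & sqlpackage.exe /Action:Publish `
        "/SourceFile:$($dacpac.FullName)" `
        "/TargetServerName:MySqlServer" `
        "/TargetDatabaseName:$($dacpac.BaseName)"
}
```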

Every time we run "sqlpackage.exe" to publish or script a DACPAC, we notice it takes anywhere from 30 seconds to over a minute to complete initialization. Most of this time is spent in the initialization phase when "sqlpackage.exe" starts.

I'm trying to find a way to reduce this if possible. Since we have 12 DBs, we are talking about at least 12 minutes for the database deployment, which is too much for us.

Do you know of any way to reduce this?

JarJarrr
    Have you considered executing the updates simultaneously (in parallel)? – Steven Green Aug 06 '15 at 16:20
  • I use a similar approach, and generating the script from the dacpac also takes a long time for me. I have a PowerShell script to run the scripting jobs in parallel in groups of four. Note that it uses up to 100% of our CPU when building four scripts at once. – scar80 Aug 07 '15 at 06:18
  • And along those lines, have you considered not pushing changes if there are no actual changes? It's a little more to keep track of up front, but that could also reduce your time to release if you're not updating each database. – Peter Schott Aug 07 '15 at 22:06
  • I ran the dacpacs in parallel; the issue is that there seem to be dependencies between the different DBs. So running in parallel is a good idea, but you really need to be aware of the dependencies between your databases. – JarJarrr Sep 08 '15 at 06:48
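A minimal sketch of the parallel approach suggested in the comments above, using PowerShell background jobs. The database names and grouping are hypothetical; the point is that databases with no dependencies on each other can be published together, while dependent ones wait for an earlier group to finish:

```powershell
# Hypothetical grouping: later groups depend on objects deployed by earlier groups.
$groups = @(
    @('CoreDb'),                     # deployed first; other DBs reference it
    @('SalesDb', 'HrDb', 'AuditDb')  # independent of each other
)

foreach ($group in $groups) {
    $jobs = foreach ($db in $group) {
        Start-Job -ArgumentList $db -ScriptBlock {
            param($db)
            & sqlpackage.exe /Action:Publish `
                "/SourceFile:C:\build\dacpacs\$db.dacpac" `
                "/TargetServerName:MySqlServer" `
                "/TargetDatabaseName:$db"
        }
    }
    $jobs | Wait-Job | Receive-Job   # wait for the whole group before starting the next
    $jobs | Remove-Job
}
```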

2 Answers

1

The time up front is, I believe, used to get the current schema from the database, compare it to what's in the dacpac, and come up with the change scripts required. There is no way to eliminate that using a dacpac.

However, you could do that work ahead of time by having SqlPackage create the upgrade scripts, and then at deploy time just run those scripts. This could reduce downtime to just the time it takes to run the scripts, and, as suggested, running them in parallel could reduce the downtime even further.
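A rough sketch of that split, with placeholder names and paths; the generated file is a SQLCMD script, so sqlcmd.exe is one way to run it at deploy time:

```powershell
# Ahead of deployment: SqlPackage does the expensive compare and writes a script.
& sqlpackage.exe /Action:Script `
    "/SourceFile:C:\build\dacpacs\SalesDb.dacpac" `
    "/TargetServerName:MySqlServer" `
    "/TargetDatabaseName:SalesDb" `
    "/OutputPath:C:\build\scripts\SalesDb-upgrade.sql"

# At deploy time: only the pre-generated script is executed.
& sqlcmd.exe -S MySqlServer -d SalesDb -i "C:\build\scripts\SalesDb-upgrade.sql"
```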

If all your DBs are guaranteed to be in the same (schema) state, you could just create a single upgrade script from the first DB and run it on all of them.

Mark
  • That is an interesting idea, to create the scripts prior to deployment. The only issue is that we want this to be part of continuous integration and deployment, so creating them beforehand is not really possible. – JarJarrr Sep 08 '15 at 06:40
0

This might be too late, but it may be useful for others: we found that opening up all the ports in the firewall on the SQL server solved our problem of deployment taking a long time. It went from about 4-5 minutes to about 1 minute.

benpage
  • Can you say which ports you opened when you say "we found that opening up all ports in the firewall on the SQL server"? – Gwasshoppa Dec 19 '17 at 02:19
  • As in, we disabled the Windows Server firewall, so all ports are open now. For clarity, it was the firewall on the SQL server we disabled, not the firewall on the web server (which is doing the release). – benpage Dec 20 '17 at 02:46
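For anyone who would rather not disable the firewall entirely, a minimal sketch of a narrower rule set that only opens SQL Server's default ports on the database server (this assumes a default instance on TCP 1433; named instances or non-default ports would need different values):

```powershell
# Allow the database engine (default instance) and the SQL Server Browser service.
New-NetFirewallRule -DisplayName 'SQL Server (TCP 1433)' -Direction Inbound `
    -Protocol TCP -LocalPort 1433 -Action Allow

New-NetFirewallRule -DisplayName 'SQL Server Browser (UDP 1434)' -Direction Inbound `
    -Protocol UDP -LocalPort 1434 -Action Allow
```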