
I've been looking at SQL Server Data Tools (SSDT) and have seen some information suggesting it is possible to do continuous integration with Microsoft database projects. I'm wondering whether anyone is actually doing this in practice.

The dacpac approach looks really appealing and at first attempt seems feasible, but I'm not sure. I'm looking at Visual Studio 2013 with the latest SSDT.

Thanks!

Shaunt

1 Answer


Yes, it is possible - I am currently doing it and have set it up in a number of environments.

If you have any specific questions please ask!

The basic steps are (a sketch follows the list):

  • Create dacpac on build
  • Use sqlpackage.exe to push it to a SQL Server database
  • Run tSQLt unit tests
  • Optionally, generate deployment scripts for other environments, or deploy to them directly
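
A minimal sketch of those steps as a PowerShell CI script, assuming a project called MyDatabase.sqlproj, Visual Studio 2013 / SQL Server 2014 tooling on the build agent and a local instance for the CI database - all paths, server and database names below are placeholders to adapt:

```powershell
# Sketch of a CI build step: build the dacpac, publish it to a CI database,
# then run the tSQLt tests. All paths and names below are assumptions.
$msbuild    = "C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe"
$sqlpackage = "C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe"

# 1. Build the database project - produces MyDatabase.dacpac under bin\Release
& $msbuild .\MyDatabase.sqlproj /t:Build /p:Configuration=Release
if ($LASTEXITCODE -ne 0) { throw "Build failed" }

# 2. Push the dacpac to the CI database
& $sqlpackage /Action:Publish /SourceFile:".\bin\Release\MyDatabase.dacpac" /TargetServerName:"(local)" /TargetDatabaseName:"MyDatabase_CI"
if ($LASTEXITCODE -ne 0) { throw "Deploy failed" }

# 3. Run the tSQLt unit tests; -b makes sqlcmd exit non-zero if a test fails
sqlcmd -S "(local)" -d "MyDatabase_CI" -b -Q "EXEC tSQLt.RunAll"
if ($LASTEXITCODE -ne 0) { throw "Unit tests failed" }
```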

You can swap sqlpackage for something like Redgate SQL Compare, or look at the Redgate CI tools if you want an easier setup - I prefer to run sqlpackage as part of the build and control it myself, but that is a personal preference.

Whichever approach you take to generating scripts or deploying databases (sqlpackage or Redgate), the tools are command-line tools, so they integrate with MSBuild or any CI script / server.
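
For the "generate scripts" option, the same dacpac can be turned into a reviewable upgrade script for deploying at a later date rather than being published directly. A sketch, with the server, database and output path assumed:

```powershell
# Sketch: produce an upgrade script for another environment instead of deploying.
# Server, database and output path are assumptions.
$sqlpackage = "C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe"

& $sqlpackage /Action:Script /SourceFile:".\bin\Release\MyDatabase.dacpac" /TargetServerName:"UatSqlServer" /TargetDatabaseName:"MyDatabase" /OutputPath:".\artifacts\MyDatabase_upgrade.sql"
```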

Ed

Ed Elliott
  • Thanks! I had a prototype up and running, but I had some issues with some of the publish settings. Which settings do you use? Ideally you are just letting the dacpac do all the schema migration, right? – Shaunt Mar 12 '15 at 15:27
  • I've used a mixture of getting sqlpackage to do the deploy and also generating scripts and deploying them at a later date - it just depends what you want, really. I enable "drop all objects not in source", otherwise you leave old objects behind - I also filter out some things like filegroups and security so I don't get warnings and don't have to include them in the project. Deploying to a CI db I always allow data loss - in production I generally don't! – Ed Elliott Mar 12 '15 at 15:42
  • Yes, that was part of the issue I was having, actually... the DBAs wanted to manage security, and with "drop all objects not in source" it seemed like I couldn't ignore security... it sounds like you are saying it is possible, so perhaps I need to look at the settings again. I couldn't seem to get it to only handle stored procs, views, tables, schemas, functions, indexes, etc. Any chance you can post your settings? I'd love to see what actually worked so I can compare with what I was using. The few demos I've seen of how to set this up really didn't go into any detail on settings. – Shaunt Mar 12 '15 at 20:55
  • A quick addition to Ed's comments - we (the SSDT team) have a presentation and whitepaper on continuous integration using Team Foundation Server on our blog: http://blogs.msdn.com/b/ssdt/. The whitepaper covers which approaches to use depending on your overall setup. Also note the easiest way to integrate tSQLt is here: http://zogamorph.blogspot.co.uk/2014/07/integrating-tsqlt-with-sql-server-data.html – Kevin Cunnane Mar 13 '15 at 01:45
  • Hey Shaun, if you get the latest version of SSDT there are checkboxes to exclude users etc: http://blogs.msdn.com/b/ssdt/archive/2015/02/23/new-advanced-publish-options-to-specify-object-types-to-exclude-or-not-drop.aspx - before this I wrote a filter, and that is what I use, but I will change over to the new SSDT when I next do something to the build (/p:AdditionalDeploymentContributorArguments="SqlPackageFilter=IgnoreSecurity" - https://the.agilesql.club/Blogs/Ed-Elliott/HOWTO-Filter-Dacpac-Deployments) – Ed Elliott Mar 13 '15 at 08:28
  • Another question... there are cases where you would want something to run in a single deployment only - for example, splitting a table into master/detail. How would you do that via CI? I would think a pre-deployment script to store the data and delete it, then the standard schema changes, and a post-deployment script to reload the data. But how do you keep it from running this on every deployment? Is there a way to use the build version #, or do you just build an if statement checking whether the table you are splitting still has fields a, b, c? (this would suck). Or would you add data to a special table to mark that you already did it? – Shaunt Mar 31 '15 at 06:50
  • Hey Shaun, the way I do this sort of thing is with pre/post-deploy scripts, and I remove them after each major release once I know they have been deployed everywhere (but you can leave them in forever) - just run a query that checks whether the change has already been applied, i.e. if you move column b from table a to table c, do an if exists (select * from sys.columns where name = 'b' and object_id = object_id('a')) - this will only fire while the column is still on the original table you want to move it away from - "Idempotent all the things!" – Ed Elliott Mar 31 '15 at 07:38
  • Got it... how are you managing SQL Agent jobs? I was looking at this article: http://blog.sqlconcepts.co.nz/2012/09/ssdt-sql-agent-job-deployment-template.html ... it looks pretty promising, although I'm not sure how I would test whether the jobs needed updating or not - any thoughts? – Shaunt Apr 16 '15 at 03:48
  • @Shaunt we have built our own procs that take params to create jobs, steps and schedules. They always execute and basically drop and recreate the jobs, steps and schedules on every deployment. This way you don't need to determine whether they need to be updated; they will always reflect what is in source control. To answer your earlier question about whether people do this in practice: we are a finance company and deploy around 75+ databases through 5 environments this way. Our BI team also use this for the warehouse and ETLs through 3 environments. We use Jenkins, TeamCity and PowerShell for CI. – Eugene Niemand May 19 '16 at 09:39
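
Pulling together the publish options discussed in the comments above, a publish call for a CI database might look like the sketch below. DropObjectsNotInSource and BlockOnPossibleDataLoss are standard sqlpackage publish properties; ExcludeObjectTypes and DoNotDropObjectTypes are the newer options described in the SSDT blog post Ed links to, and the particular object types listed here are an assumption to adapt to whatever your DBAs manage (older SSDT versions can use Ed's deployment contributor filter instead, as shown in his comment):

```powershell
# Sketch of the publish options discussed in the comments, for a CI environment.
# Server/database names and the excluded object types are assumptions.
$sqlpackage = "C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe"

$publishArgs = @(
    "/Action:Publish"
    "/SourceFile:.\bin\Release\MyDatabase.dacpac"
    "/TargetServerName:(local)"
    "/TargetDatabaseName:MyDatabase_CI"
    "/p:DropObjectsNotInSource=True"                                  # drop anything not in the project
    "/p:BlockOnPossibleDataLoss=False"                                # fine for CI; leave blocking for production
    "/p:ExcludeObjectTypes=Users;Logins;Permissions;RoleMembership"   # let the DBAs own security
    "/p:DoNotDropObjectTypes=Filegroups"                              # don't drop filegroups kept out of the project
)
& $sqlpackage @publishArgs
```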