
Is it possible to deploy different sets of seed data for different publish profiles using a Visual Studio SQL Server Data Tools (SSDT) database project?

We know you can deploy seed data using a post-deployment script. We know you can deploy to different environments using publish profiles. What we don't know is how to deploy different seed data to the different environments.

Why would we want to do this?

  • We want to be able to do this so we can have a small explicit set of seed data for unit testing against.
  • We need a wider set of data to deploy to the test team's environment, so they can test the whole application against it.
  • We need a specific set of seed data for the pre-prod environment.
Dib

2 Answers


There are a few ways you can achieve this. The first approach is to check for the environment in the post-deploy script, such as:

if @@servername = 'dev_server'
begin
   -- insert dev-specific seed data here
end

A slightly cleaner version is to have a different script file for each environment and include them via the :r SQLCMD directive, so you could have:

  • PostDeploy.sql
  • DevServer.sql
  • QAServer.sql

then

if @@servername = 'dev_server'
begin
   :r DevServer.sql
end
if @@servername = 'qa_server'
begin
   :r QAServer.sql
end

You will need to make sure the paths to the .sql files are correct and that you copy them alongside the dacpac.

You don't have to use @@servername; you can use SQLCMD variables and pass them in for each environment, which again is a little cleaner than hardcoding server names.
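As a minimal sketch of that variant (assuming a SQLCMD variable named DeployEnv is declared in the project's SQLCMD Variables and given a value in each publish profile):

if '$(DeployEnv)' = 'Dev'
begin
   :r DevServer.sql
end
if '$(DeployEnv)' = 'QA'
begin
   :r QAServer.sql
end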

The second approach is to modify the dacpac to replace the post-deploy script with your environment-specific one. This is my preferred approach and works best as part of a CI build; my process is:

  1. Check-in changes
  2. Build Server builds dacpac
  3. Build takes the dacpac and copies it to the dev, qa, prod, etc. environment folders
  4. Build replaces the post-deploy script in each copy with the environment-specific script

I call the scripts PostDeploy.dev.sql, PostDeploy.Qa.sql, etc. and set their Build Action to "None", so they appear as "Script, Not in Build".

To replace the post-deploy script you just need to use the .NET Packaging API; for some examples take a look at my Dir2Dac demo, which does that and more:

https://github.com/GoEddie/Dir2Dac

more specifically:

https://github.com/GoEddie/Dir2Dac/blob/master/src/Dir2Dac/DacCreator.cs

// a dacpac is an Open Packaging Conventions package, so the post-deploy
// script can be written in as a package part
var part = package.CreatePart(new Uri("/postdeploy.sql", UriKind.Relative), "text/plain");

// stream the environment-specific script into the new part
using (var reader = new StreamReader(_postDeployScript))
{
    reader.BaseStream.CopyTo(part.GetStream(FileMode.OpenOrCreate, FileAccess.ReadWrite));
}
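For a fuller picture, here is a self-contained sketch of the same idea using System.IO.Packaging (the paths and class name are illustrative assumptions, not code from Dir2Dac): it opens an existing dacpac, removes the built-in post-deploy part if present, and writes the environment-specific script in its place.

using System;
using System.IO;
using System.IO.Packaging; // WindowsBase / System.IO.Packaging package

class PostDeployReplacer
{
    static void Main()
    {
        var dacpacPath = @"output\MyDatabase.dacpac";  // dacpac produced by the build (assumed path)
        var scriptPath = @"output\PostDeploy.Qa.sql";  // environment-specific script (assumed path)
        var partUri = new Uri("/postdeploy.sql", UriKind.Relative);

        using (var package = Package.Open(dacpacPath, FileMode.Open, FileAccess.ReadWrite))
        {
            // drop the post-deploy script that was built into the dacpac, if any
            if (package.PartExists(partUri))
                package.DeletePart(partUri);

            // write the environment-specific script as the new post-deploy part
            var part = package.CreatePart(partUri, "text/plain");
            using (var source = File.OpenRead(scriptPath))
            using (var target = part.GetStream(FileMode.Create, FileAccess.Write))
            {
                source.CopyTo(target);
            }
        }
    }
}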
Ed Elliott
  • Thank you for your detailed answer. For the time being I think I'll go with the multiple script method you have described under "A slightly cleaner version is to have different script files for each environment". Maybe we will look at the dacpac method as needs arise. – Dib Nov 16 '15 at 11:42
  • I'm going to need to look at that latter option you're using, Ed. Considering the possible size for some of these initial population scripts, that would be a HUGE amount of data to include in every script. We're considering just looking at the build variable if it's "New" and running the new things post-publish right now (which might be the better/easier choice for the moment). – Peter Schott Nov 16 '15 at 23:15

I have solved this by writing a PowerShell script that is executed automatically when publishing, via an Exec command in the project file.

It creates a script file that includes all scripts found in a folder in the project (the folder is named after the target environment).

This script is then included in the post-deploy script.
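As a rough illustration of that idea (the folder layout, file names, and parameter are assumptions, not the poster's actual script), the PowerShell could look something like this:

param([string]$Environment = "Dev")

# concatenate every .sql file in the folder named after the target environment
# into one file that the post-deploy script pulls in via :r
$sourceFolder = Join-Path $PSScriptRoot "SeedData\$Environment"
$outputFile   = Join-Path $PSScriptRoot "SeedData\GeneratedSeedData.sql"

Get-ChildItem -Path $sourceFolder -Filter *.sql |
    Sort-Object Name |
    ForEach-Object { Get-Content $_.FullName -Raw } |
    Set-Content -Path $outputFile

PostDeploy.sql would then contain a single :r .\SeedData\GeneratedSeedData.sql line, and the Exec task in the .sqlproj would run the script with the right environment name before the project is published.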

SAS