
Looking for the best practice on refreshing a QA/Test Azure SQL Database from a Production Azure SQL Database

The production database is on a different server and resource group. So just wondering the best method for getting the production data into the qa/testing database. What tools are available for a task like this?

pim
77Vetter

2 Answers


You just need to copy the production database, using either the Azure portal or PowerShell:

New-AzureRmSqlDatabaseCopy -ResourceGroupName "myResourceGroup" `
    -ServerName $sourceserver `
    -DatabaseName "MySampleDatabase" `
    -CopyResourceGroupName "myResourceGroup" `
    -CopyServerName $targetserver `
    -CopyDatabaseName "CopyOfMySampleDatabase"

You can also automate refreshing the QA database by recreating it with Azure Automation and the following T-SQL statement, executed against the master database of the target server:

CREATE DATABASE db_copy   
    AS COPY OF ozabzw7545.db_original ( SERVICE_OBJECTIVE = 'P2' );
Alberto Morillo
  • Is there a way to exclude a table? We have an audit table with 150 million rows that we don't want/need to be refreshed. Also, I tried running New-AzureRmSqlDatabaseCopy from Cloud Shell, but it keeps giving me an error that the resource could not be found. I have the exact resource name from the source DB (copied/pasted), so I'm not sure why it says it can't find the resource. – 77Vetter May 30 '18 at 15:10
  • I am only seeing resources from our production subscription, so that explains the resource-could-not-be-found error. I tried switching the subscription using az account set --subscription, but that doesn't seem to make any difference; I still only get the production resources. – 77Vetter May 30 '18 at 17:11
  • Use the CREATE DATABASE statement instead. If you don't need the 150 million rows table, truncate it after copying the database. You can automate all this using Azure Automation. https://learn.microsoft.com/en-us/azure/automation/automation-first-runbook-textual-powershell – Alberto Morillo Jun 01 '18 at 02:29
  • https://gallery.technet.microsoft.com/scriptcenter/Azure-Automation-Your-SQL-30f8736b – Alberto Morillo Jun 01 '18 at 02:31
  • @AlbertoMorillo how is that done? We are using the CREATE DATABASE ... AS COPY OF command but want to skip two tables. – TWilly Feb 11 '21 at 17:50
  • @TWilly CREATE DATABASE ... AS COPY OF does not allow you to exclude tables. You will have to drop them after the copy has finished. If those 2 tables are big, you may also want to recover some space (SHRINK). – Alberto Morillo Feb 11 '21 at 22:03
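The copy-then-truncate approach discussed in these comments can be sketched as a single script. This is only an illustration: all resource group, server, database, table, and credential names below are placeholders, and the sketch assumes the SqlServer PowerShell module (for Invoke-Sqlcmd) is available:

```
# Sketch: copy production into QA (AzureRm cmdlet as used in the answer above;
# the AzureRm module is deprecated, and the newer Az equivalent is New-AzSqlDatabaseCopy).
New-AzureRmSqlDatabaseCopy -ResourceGroupName "prod-rg" `
    -ServerName "prod-server" `
    -DatabaseName "ProdDb" `
    -CopyResourceGroupName "qa-rg" `
    -CopyServerName "qa-server" `
    -CopyDatabaseName "QaDb"

# Then remove the audit data that isn't needed in QA.
Invoke-Sqlcmd -ServerInstance "qa-server.database.windows.net" `
    -Database "QaDb" `
    -Username "qa-admin" -Password $qaPassword `
    -Query "TRUNCATE TABLE dbo.AuditLog;"
```

Note that the source and target must be visible to the same credentials/subscription context, which is the issue 77Vetter hit above.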

The most common export format for Azure SQL databases is the bacpac, and believe me when I tell you that it is AWESOME.

Exporting

The easiest way to do this is using the Azure Portal or with SSMS.

This will, however, copy the entire database schema and all data. If you need something more specific, such as controlling which tables' data is included, look no further than sqlpackage.exe. Note that /p:TableData works by inclusion: when specified, only the listed tables' data is exported (the schema of all tables is still included).

.\sqlpackage.exe /Action:Export `
    /ssn:SERVER /sdn:DATABASE `
    /tf:"C:\PATH\TO\FILE.bacpac" /of:True `
    /p:TableData=SCHEMA.TABLE1 /p:TableData=SCHEMA.TABLE2 /p:TableData=SCHEMA.TABLE3

Importing

To create a database from the .bacpac you created above, all three of the aforementioned methods also support importing.
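For example, the sqlpackage.exe side of an import might look like the following sketch (server and database names are placeholders; /tsn, /tdn, and /sf are the target server, target database, and source file parameters):

```
.\sqlpackage.exe /Action:Import `
    /tsn:qa-server.database.windows.net /tdn:QaDb `
    /sf:"C:\PATH\TO\FILE.bacpac" `
    /p:DatabaseServiceObjective=S2
```

The import creates the database, so the target database must not already exist; drop the old QA database first when refreshing.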

Recommendations

I would apply the KISS principle here and just use the portal/SSMS on both ends, then drop the specific tables you no longer want/need after the import.
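If you go the copy-everything-first route, the post-copy cleanup is only a couple of T-SQL statements run against the refreshed QA database (the table and database names here are placeholders):

```
-- Remove the data you don't need in QA; use DROP TABLE instead
-- if the table itself shouldn't exist there at all.
TRUNCATE TABLE dbo.AuditLog;

-- Optionally reclaim the space the deleted rows occupied.
DBCC SHRINKDATABASE (N'QaDb');
```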

pim
    What I ended up doing was using Data Tools in Visual Studio and creating connections to both Production and QA Azure Sql Servers. Then once the connections were there I could right click and select Extract Data Tier Application on the production db. From there I could select/exclude the tables and it created the dacpac file. Then on the QA db I could right click and select Publish Data-tier Application and it brought in the data. This could also be done via SSMS but VS makes it pretty easy. Thanks for getting me pointed in the right direction! – 77Vetter May 31 '18 at 16:09
  • No problem! Dacpac's are tried and true. I'm glad it's all worked out for you! – pim May 31 '18 at 16:22