
Okay, so since Azure has a lot of tools and I am struggling to find the best process, I figured I would just ask on SO.

What I can do:

BACPAC creation

Creation is done directly from SSMS via the database in Object Explorer: Right Click > Tasks > Export Data-tier Application, then choose a disk location. The caveat is that the database should generally not be in active use while exporting, so use a copy or a system that is not querying the database.
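If you would rather script the export than click through SSMS, SqlPackage can do it too with /a:Export (the same tool used for the import below). This is only a sketch; the path, server, and database names are placeholders:

```shell
# Hypothetical SqlPackage path -- adjust to your install.
SQLPACKAGE="/c/Program Files (x86)/Microsoft Visual Studio/2017/Enterprise/Common7/IDE/Extensions/Microsoft/SQLDB/DAC/130/SqlPackage.exe"

# /a:Export writes a .bacpac from a live database; run it against a
# restored copy so production queries are not hitting the source.
CMD="\"$SQLPACKAGE\" /a:Export /ssn:MYSERVER /sdn:MyDb /tf:C:\\backups\\MyDb.bacpac"

echo "$CMD"   # review, then run when ready
```

Building the command as a string and echoing it first makes it easy to sanity-check the flags before running a multi-hour export.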

BACPAC restore to Azure

  1. SqlPackage.exe https://learn.microsoft.com/en-us/sql/tools/sqlpackage?view=sql-server-2017

    When I do it, it looks similar to this (you must have Visual Studio or the equivalent SQL management object DLLs installed): "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\130\SqlPackage.exe" /a:Import /sf:(bacpac location) /tdn:(dbName) /tsn:bvtest.database.windows.net /tu:(user) /tp:(password)

  2. In Azure > SQL Servers > (top ribbon) > Import database (with a bacpac already uploaded to blob storage). This method appears slightly faster when both are in the same region.
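The portal import in option 2 can also be scripted with the Azure CLI via az sql db import. A hedged sketch; every resource name, key, and credential below is a placeholder:

```shell
# Hypothetical resource names and credentials -- substitute your own.
RG="my-resource-group"
SERVER="bvtest"
DB="MyDb"
STORAGE_URI="https://mystorage.blob.core.windows.net/backups/MyDb.bacpac"

# az sql db import pulls the bacpac straight from blob storage,
# avoiding a local upload step (fastest when the storage account
# and the SQL server are in the same region, as noted above).
CMD="az sql db import --resource-group $RG --server $SERVER --name $DB \
  --storage-key-type StorageAccessKey --storage-key <key> \
  --storage-uri $STORAGE_URI --admin-user <admin> --admin-password <password>"

echo "$CMD"   # review, then run when ready
```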

My question: I have seen articles like this one: https://learn.microsoft.com/en-us/sql/azure-data-studio/tutorial-backup-restore-sql-server?view=sql-server-2017, and I know SSMS pretty well. It seems all options to restore a .bak backup to Azure SQL PaaS are disabled, and you have to create a bacpac to go from an instance to Azure PaaS. The problem is that this takes hours just to create the bacpac on a relatively medium-sized database of 120 GB, and then it takes hours to restore. Other than what I have tried above, is there a faster way?

  • Is this a one time data migration scenario where you cannot afford a huge downtime? – Kalyan Chanumolu-MSFT Jun 11 '19 at 16:22
  • @KalyanChanumolu-MSFT Yes, for the most part. I am working on an existing system with an Azure SQL PaaS on an elastic pool, so I have some flexibility to take a long time for the initial baseline and then do a differential. But I cannot afford a six-hour cutover: about 3 hours for a bacpac and 3 hours for a restore. – djangojazz Jun 11 '19 at 19:14

1 Answer


You can use Azure Data Migration Service (ADMS) instead of importing/exporting bacpacs. ADMS lets you migrate data from SQL Server to Azure SQL Database with no downtime, provided your database has no heaps; if it does, you will have to perform an offline migration. Copying data from a SQL Server ARM (IaaS) VM on Premium Storage (200 GB / 790 tables) to an Azure SQL Database (Premium) took 1 hour 40 minutes with an offline migration. That is the fastest data migration I have seen as a workaround, since SQL Server native backups are not supported on Azure SQL Database (DTU model or vCore model).

If you want, ADMS can also validate your data at the end of the migration.

  • Thanks! I am trying it now. Surprised I have not run into this one while trying different things. If it works within the time you are claiming, I will accept the answer. I am at the script-and-deploy step and it is currently at 400 out of 10k scripts to run; it jumps through a bunch and then halts. I wonder if this is due to running it on an older Azure SQL PaaS from several months of migrations back. – djangojazz Jun 11 '19 at 19:11
  • I usually migrate the schema first using Data Migration Assistant v4.3 https://www.microsoft.com/en-us/download/details.aspx?id=53595, then migrate the data using ADMS. For the fastest experience, upgrade your SQL Server IaaS to SSD and use P15 on the Azure SQL database. Once your migration finishes, downgrade to the intended service level. – Alberto Morillo Jun 11 '19 at 19:22
  • The P setting is just the equivalent DTU ramped up all the way, right? Is it better or worse to use the vCore versus DTU model in migrations? I am still experimenting with one versus the other. – djangojazz Jun 11 '19 at 19:49
  • Looks like this would not work for a migration on an existing db though. When it gets to the 'Select tables' section it will not allow you to migrate tables that have data in the target already. Curious if this works fast with a blank database. – djangojazz Jun 11 '19 at 20:21
  • The target database should be blank, containing only the schema (empty, no data). – Alberto Morillo Jun 11 '19 at 21:05
  • Whether you are using the DTU model or the vCore model, scale up to a higher tier during the migration for the best performance, then scale back down when it finishes. – Alberto Morillo Jun 11 '19 at 21:12
  • The DTU model ties storage and compute together: you end up paying a lot for compute even if all you need is extra storage. With the vCore model, the idea is that you can scale storage and compute independently of each other and pay less, although you need to pay licensing for that model. – Alberto Morillo Jun 11 '19 at 21:14
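The scale-up-then-scale-down tip from the comments can be scripted too. A sketch with the Azure CLI's az sql db update; the resource group, server, database, and the "normal" tier (S3 here) are all hypothetical:

```shell
# Hypothetical names -- substitute your own resources and tiers.
RG="my-resource-group"
SERVER="bvtest"
DB="MyDb"

# Scale up to Premium P15 before kicking off the migration...
UP="az sql db update --resource-group $RG --server $SERVER --name $DB --service-objective P15"
# ...and back down to the intended service level once it finishes.
DOWN="az sql db update --resource-group $RG --server $SERVER --name $DB --service-objective S3"

echo "$UP"
echo "$DOWN"   # review, then run when ready
```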