
Does anyone have an idea of the best way to implement continuous, incremental replication of some DB tables from one Azure SQL Database (PaaS) to another?

I have tried Data Sync preview (the schema is not loading even after a couple of hours) and Data Factory (Copy Data), which is fast but always copies the entire data set (duplicate records) rather than loading incrementally.

Please suggest.

BrianAtkins
VIJAY RAAVI
  • Did you check this out already? https://blogs.msdn.microsoft.com/sql_server_team/transactional-replication-to-azure-sql-db-is-now-in-public-preview/ – Aravind Jul 21 '16 at 06:59
  • @Aravind, I have checked it; it's between on-prem and Azure SQL (PaaS). – VIJAY RAAVI Jul 21 '16 at 09:47
  • I need replication between Azure SQL (PaaS) and Azure SQL (PaaS). – VIJAY RAAVI Jul 21 '16 at 09:48
  • If you are looking for a backup and disaster recovery option, there is the active geo-replication feature. If you are looking at copying data across, check this question: http://stackoverflow.com/questions/6572754/sql-azure-copy-table-between-databases – Aravind Jul 21 '16 at 09:57
  • I'm not interested in the geo-replication feature; I'm looking for replication of some DB tables between two different Azure SQL Databases (these DBs have 300+ tables in them). – VIJAY RAAVI Jul 22 '16 at 14:18

3 Answers


What is the business requirement behind this request?

1 - Do you have some reference data in database 1 and want to replicate that data to database 2?

If so, then use cross-database querying if you are on the same logical server. See my article on this for details.

2 - Can you have a duplicate copy of the database in a different region? If so, use active geo-replication to keep the database in sync. See my article on this for details.

3 - If you just need a couple of tables replicated and the data volume is low, then just write a simple PowerShell program (workflow) to trickle-load the target from the source.

Schedule the program in Azure Automation on a timing of your choice. I would use a flag to indicate which records have been replicated.

Place the insert into the target and the update of the source flag in a single transaction to guarantee consistency. This is a row-by-agonizing-row (RBAR) pattern.

You can even batch the records. Look into using the SqlBulkCopy class in the System.Data.SqlClient namespace of .NET.
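The flag-and-transaction pattern described above can be sketched in a few lines. This is a minimal illustration, not the answerer's actual PowerShell workflow: sqlite3 in-memory databases stand in for the source and target Azure SQL Databases (in practice you would connect with a SQL Server driver such as pyodbc), and the table, column, and flag names are made up for the example.

```python
import sqlite3

# Two in-memory databases stand in for the source and target Azure SQL DBs.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")

# Hypothetical schema: the source carries a "replicated" flag column.
src.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL,"
            " replicated INTEGER DEFAULT 0)")
tgt.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
src.executemany("INSERT INTO orders (id, amount) VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])
src.commit()

def trickle_load(batch_size=100):
    """Copy not-yet-replicated rows to the target, then flag them on the source."""
    rows = src.execute(
        "SELECT id, amount FROM orders WHERE replicated = 0 LIMIT ?",
        (batch_size,)).fetchall()
    if not rows:
        return 0
    try:
        tgt.executemany("INSERT INTO orders (id, amount) VALUES (?, ?)", rows)
        src.executemany("UPDATE orders SET replicated = 1 WHERE id = ?",
                        [(r[0],) for r in rows])
        # With two separate databases a true distributed transaction is not
        # available here; committing the source flag only after the target
        # insert succeeds approximates the consistency guarantee.
        tgt.commit()
        src.commit()
    except Exception:
        tgt.rollback()
        src.rollback()
        raise
    return len(rows)

print(trickle_load())  # prints 3: all rows copied on the first run
print(trickle_load())  # prints 0: nothing new to copy, so the load is incremental
```

Scheduling a run like this from Azure Automation on a timer, as the answer suggests, gives a simple continuous replication loop for low data volumes.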

4 - Last but not least, Azure SQL Database now supports the OPENROWSET command. Unfortunately, in the cloud this feature is limited to a read-only, from-blob-storage-file pattern. The older on-premises version of the command also allows you to write to a file.

I hope these suggestions help.

Happy Coding.

John The Crafty DBA

CRAFTY DBA

If you want to use Azure Data Factory for incremental updates, you need to change your query to filter on a created/modified date column on the source table. You can then land that data in a "staging table" on the destination side, use a stored procedure activity to do your insert/update into the "real" table, and finally truncate the staging table.
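The three steps of this watermark pattern can be sketched end to end. This is an illustrative sketch only, with sqlite3 standing in for both databases and invented table names; in a real pipeline step 1 would be the Data Factory copy activity's source query and step 2 a T-SQL stored procedure (typically a MERGE).

```python
import sqlite3

db = sqlite3.connect(":memory:")  # one connection stands in for both sides

# Hypothetical source table with a modified-date column, plus the
# destination's staging table and "real" table.
db.executescript("""
CREATE TABLE src_orders (id INTEGER PRIMARY KEY, amount REAL, modified TEXT);
CREATE TABLE staging    (id INTEGER PRIMARY KEY, amount REAL, modified TEXT);
CREATE TABLE real_table (id INTEGER PRIMARY KEY, amount REAL, modified TEXT);
INSERT INTO src_orders VALUES (1, 10.0, '2016-07-01'), (2, 25.0, '2016-07-20');
""")

def incremental_copy(watermark):
    """One pipeline run: copy rows changed since the watermark, upsert, truncate."""
    # 1. Copy-activity query filtered on the created/modified date.
    db.execute("""INSERT INTO staging
                  SELECT id, amount, modified FROM src_orders
                  WHERE modified > ?""", (watermark,))
    # 2. Stored-proc step: insert/update the real table from staging
    #    (sqlite's INSERT OR REPLACE plays the role of a T-SQL MERGE).
    db.execute("INSERT OR REPLACE INTO real_table SELECT * FROM staging")
    # 3. Truncate the staging table for the next run.
    db.execute("DELETE FROM staging")
    # The new watermark is the latest modified date loaded so far.
    return db.execute("SELECT MAX(modified) FROM real_table").fetchone()[0]

wm = incremental_copy('1900-01-01')  # first run copies both rows
db.execute("UPDATE src_orders SET amount = 99.0, modified = '2016-07-21' WHERE id = 2")
wm = incremental_copy(wm)            # second run picks up only the changed row
print(db.execute("SELECT id, amount FROM real_table ORDER BY id").fetchall())
# → [(1, 10.0), (2, 99.0)]
```

Persisting the watermark between runs (for example in a control table) is what makes each pipeline execution pick up only new or changed rows instead of re-copying everything.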

Hope this helps.

JustLogic

I was able to achieve cloud-to-cloud migration using Data Sync Preview from the Azure ASM portal. Below are the limitations:

  • Maximum number of sync groups any database can belong to: 5
  • Characters that cannot be used in object names: the names of objects (databases, tables, columns) cannot contain the printable characters period (.), left square bracket ([), or right square bracket (]).

Supported limits on DB dimensions are covered in the reference below.

Reference: http://download.microsoft.com/download/4/E/3/4E394315-A4CB-4C59-9696-B25215A19CEF/SQL_Data_Sync_Preview.pdf

VIJAY RAAVI