
I'm looking to update my dev db to match my prod db at regular intervals (say weekly). Is there a way to do this using Azure itself, or do I need to write a script?

Thank you!

DMop
    You'd need to take care of this yourself, as there is no built-in sync mechanism. That said: take a look at Cosmos DB's change feed, as you can use that to sync data to another container. Look at Data Factory as well. Plenty of examples are provided in the docs, to show how to use these two services. – David Makogon Dec 17 '20 at 13:19
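For reference, a minimal sketch of the change feed approach David mentions, using the azure-cosmos Python SDK. The account URIs, keys, and database/container names are placeholders, and a real job would persist the continuation token between runs rather than reading from the beginning each time:

```python
# Hypothetical sketch: read changes from the prod container via the change
# feed and upsert them into the dev container. All names/keys are placeholders.
from azure.cosmos import CosmosClient

prod = CosmosClient("https://prod-account.documents.azure.com:443/", credential="<prod-key>")
dev = CosmosClient("https://dev-account.documents.azure.com:443/", credential="<dev-key>")

src = prod.get_database_client("appdb").get_container_client("items")
dst = dev.get_database_client("appdb").get_container_client("items")

# Reads the whole feed here; a real job would persist and pass the
# continuation token so each run only processes new changes.
for item in src.query_items_change_feed(is_start_from_beginning=True):
    item = {k: v for k, v in item.items() if not k.startswith("_")}  # drop system properties
    dst.upsert_item(item)
```

Note that the change feed does not surface deletes, so documents removed from prod would linger in dev.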

4 Answers


I have an idea for your case, though I'm not sure it fits what you want. Here is an answer from @David Makogon to another question. What do you think about exporting and importing the documents manually? Or you could create an Azure Function with a timer trigger to execute these operations on a schedule, as sketched below.
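A minimal sketch of that timer-triggered Function, using the Python v2 programming model; the NCRONTAB schedule and the sync_prod_to_dev helper are placeholders for whatever copy logic you choose:

```python
# Hypothetical sketch of a timer-triggered Azure Function (Python v2 model).
import azure.functions as func

app = func.FunctionApp()

def sync_prod_to_dev() -> None:
    """Placeholder: export documents from prod and upsert them into dev."""
    ...

@app.timer_trigger(schedule="0 0 3 * * Mon", arg_name="timer")  # 03:00 every Monday
def weekly_sync(timer: func.TimerRequest) -> None:
    sync_prod_to_dev()
```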

I have tried the Data Migration Tool, and it works for importing data.

Tiny Wang
  • Thank you! I'd rather not do it manually since we want to do it ~once a week. It seems like I'll need to use one of the SDKs to download the data and upsert it into the test database (a sketch of that approach follows these comments). – DMop Dec 23 '20 at 21:46
  • Yes, I hope you solve it soon. In my position, it's not hard to sync the database manually once a week. If you find a suitable SDK, could you post it as an answer so it may help more developers? Thanks. – Tiny Wang Dec 24 '20 at 01:51
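For later readers, a minimal sketch of that SDK approach (a full copy with upserts, using the azure-cosmos Python package; all account names, keys, and container names are placeholders):

```python
# Hypothetical sketch: enumerate every document in prod and upsert into dev.
from azure.cosmos import CosmosClient

prod = CosmosClient("https://prod-account.documents.azure.com:443/", credential="<prod-key>")
dev = CosmosClient("https://dev-account.documents.azure.com:443/", credential="<dev-key>")

src = prod.get_database_client("appdb").get_container_client("items")
dst = dev.get_database_client("appdb").get_container_client("items")

for doc in src.read_all_items():
    doc = {k: v for k, v in doc.items() if not k.startswith("_")}  # drop system properties
    dst.upsert_item(doc)
```

A full scan like this consumes Request Units proportional to the container size, so for large containers the change feed is the cheaper option.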

You can use a Copy activity in Azure Data Factory and execute it on a recurring basis with a schedule trigger (a scripted version is sketched after the steps below).

1. Select upsert as the write behavior.

2. Add a schedule trigger and set the recurrence interval according to your needs.
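If you'd rather script this setup than click through the portal, a rough sketch with the azure-mgmt-datafactory Python SDK follows; the pipeline name CopyProdToDev, the resource group, and the factory name are placeholders, and the pipeline is assumed to already contain the Copy activity from step 1:

```python
# Hypothetical sketch: attach a weekly schedule trigger to an existing
# Data Factory pipeline. All resource names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineReference, ScheduleTrigger, ScheduleTriggerRecurrence,
    TriggerPipelineReference, TriggerResource,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

recurrence = ScheduleTriggerRecurrence(
    frequency="Week", interval=1,
    start_time="2020-12-28T03:00:00Z", time_zone="UTC",
)
trigger = ScheduleTrigger(
    recurrence=recurrence,
    pipelines=[TriggerPipelineReference(
        pipeline_reference=PipelineReference(reference_name="CopyProdToDev"))],
)

adf.triggers.create_or_update("<resource-group>", "<factory-name>",
                              "WeeklyProdToDevSync", TriggerResource(properties=trigger))
# Triggers are created in a stopped state and must be started explicitly.
adf.triggers.begin_start("<resource-group>", "<factory-name>", "WeeklyProdToDevSync").result()
```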

Steve Johnson

Probably the easiest way these days would be to have the production account set up to use the continuous backup model.

This allows self-service point-in-time restore to a new Cosmos account, with retention set to either 30 days (chargeable) or 7 days (currently free).

There is a restore cost based on the GB size of the data restored, but this may well be cheaper than paying for the Request Units involved in reading batches from the source and writing them to the destination (unless the container is big and you are only writing incremental changes).

The main caveats are:

  • The restore each time will be to a different account so the account you regard as "your dev account" will change over time and you'll need to update connection strings etc.
  • Moving from the periodic backup model to continuous backup is not reversible, so make sure you read up on the limitations and are happy with them (incompatibility with the analytical store and multi-region writes are probably the main ones).
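To automate the weekly restore itself, a rough sketch with the azure-mgmt-cosmosdb Python SDK is below; the restorable-account ID, region, account names, and timestamp are all placeholders, and the same restore can also be done from the portal or the az cosmosdb restore CLI command:

```python
# Hypothetical sketch: create a new account by restoring a continuous-backup
# account to a point in time. All IDs, names, and timestamps are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.cosmosdb import CosmosDBManagementClient
from azure.mgmt.cosmosdb.models import (
    DatabaseAccountCreateUpdateParameters, Location, RestoreParameters,
)

client = CosmosDBManagementClient(DefaultAzureCredential(), "<subscription-id>")

params = DatabaseAccountCreateUpdateParameters(
    locations=[Location(location_name="West US 2")],
    create_mode="Restore",
    restore_parameters=RestoreParameters(
        restore_mode="PointInTime",
        restore_source=("/subscriptions/<sub-id>/providers/Microsoft.DocumentDB/"
                        "locations/westus2/restorableDatabaseAccounts/<instance-id>"),
        restore_timestamp_in_utc="2020-12-20T00:00:00Z",
    ),
)

# The target account name must be new each time; this is the "your dev account
# will change over time" caveat from the bullet list above.
client.database_accounts.begin_create_or_update(
    "<resource-group>", "dev-restore-2020-12-20", params).result()
```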
Martin Smith

I would suggest Azure Data Factory.