
We are working on a project where we are moving from an old legacy system to a new system.

So prior to going live, we do a data comparison between the old system's database and the non-prod database (we enabled the new, potentially production-ready flow in the non-prod environment for ~2 weeks).

When we are comfortable, we plan a release where we'll migrate to the new system and copy the 2 weeks of data that we accumulated in non-prod into the actual production system.

Now, with this approach there is a chance that the manual insert scripts will run into issues and possibly corrupt or delete data. I feel this is a very risky approach, as my production database already holds a huge amount of data.

Are there any guidelines/ideas for migrating such a system to production more reliably, or are database insert scripts my only choice?

ATR

1 Answer


You should always back up your databases before making big changes, so you can roll back if anything goes wrong. Users will also tolerate system downtime if you announce it ahead of time.
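As a rough sketch of what that backup step might look like: the example below assumes SQL Server and a database named LegacyProd, both of which are placeholders; on other engines you'd use the equivalent tooling (mysqldump, pg_dump, RMAN, and so on).

    -- Hypothetical pre-migration backup, assuming SQL Server.
    -- COPY_ONLY keeps this backup out of the regular backup chain;
    -- CHECKSUM verifies page integrity while the backup is written.
    BACKUP DATABASE LegacyProd
    TO DISK = 'D:\Backups\LegacyProd_pre_migration.bak'
    WITH COPY_ONLY, CHECKSUM;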

I can think of two basic ways to do this, though depending on which RDBMS you're using you may have other/better options:

  1. Do a data dump of the source table, then reload that table into the new database (e.g. under a staging schema). Then run an INSERT INTO ... SELECT to insert the rows of the old table into the new one (see the sketch after this list).

  2. Do a data dump of the source table, then edit the dump file to remove any statements that would clear the target table (DROP/TRUNCATE), then load the data directly into the new table.
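For option 1, the insert step could look roughly like this. The schema, table, and column names (staging.orders, prod.orders, order_id, etc.) are made up for illustration; substitute your own, and adjust the duplicate check to whatever key identifies a row in your system.

    -- Sketch only: copy the 2 weeks of non-prod data from a staging copy
    -- into the live production table, skipping rows that already exist.
    INSERT INTO prod.orders (order_id, customer_id, order_date, amount)
    SELECT s.order_id, s.customer_id, s.order_date, s.amount
    FROM staging.orders AS s
    WHERE NOT EXISTS (
        SELECT 1
        FROM prod.orders AS p
        WHERE p.order_id = s.order_id
    );

Running this inside a transaction (where your engine allows it) lets you verify row counts before committing.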

Either way, make sure to take a backup first, and depending on the size of your data, it may save time to remove the indexes on the target tables and rebuild them after the bulk insert has finished.
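A minimal sketch of that drop/rebuild pattern follows; the index name ix_orders_customer and its columns are hypothetical, and the exact DROP INDEX syntax varies by RDBMS (the form below is SQL Server/MySQL style; PostgreSQL uses DROP INDEX ix_orders_customer;).

    -- Drop the nonclustered index before the bulk load...
    DROP INDEX ix_orders_customer ON prod.orders;

    -- ...run the bulk INSERT INTO ... SELECT here...

    -- ...then rebuild it once the data is in place.
    CREATE INDEX ix_orders_customer
        ON prod.orders (customer_id, order_date);

Leave primary key and unique constraints in place, though, since they are what protect you from duplicate loads.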

If you can tell me which RDBMS you're using, I can look into more options for you. If you're nervous, you can always replicate your work environment elsewhere and do practice runs.

Yidna