
I have an application where some data is stored in a database and some directly on the file system. When a user performs an operation, it can trigger a change to the database and to some files on disk (a git repository, more specifically). The data on disk is around 200 GB, whereas the DB is 100 MB. I'm looking for a tool to create a consistent backup of these two data sets, i.e. the backup must not capture a point in time between the change to the DB and the corresponding write to a file. The OS is Linux.

Is there any solution other than a cold backup?

Janek

1 Answer


The usual approach is to tell the application to flush its changes to disk and halt processing. Then you take a snapshot of the underlying filesystem. At that point you can re-enable the application. Then you back up the snapshot.
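
To make that concrete, here is a minimal sketch of the sequence, assuming the database and the git repository both live on an LVM logical volume /dev/vg0/data, the application runs as a systemd unit called myapp, and the volume group has free space for a copy-on-write snapshot. All of these names are placeholders; adapt them to your setup.

```python
#!/usr/bin/env python3
"""Sketch: consistent snapshot backup of DB + git data on one LVM volume."""
import subprocess

def run(*cmd):
    """Run a command and raise if it fails."""
    subprocess.run(cmd, check=True)

try:
    # 1. Quiesce: stop the application so no writes are in flight.
    run("systemctl", "stop", "myapp")

    # 2. Snapshot the filesystem holding both the DB and the git repos.
    run("lvcreate", "--snapshot", "--size", "5G",
        "--name", "backup_snap", "/dev/vg0/data")
finally:
    # 3. Resume the application immediately; the snapshot stays frozen.
    run("systemctl", "start", "myapp")

# 4. Back up from the snapshot at leisure, then drop it.
run("mount", "-o", "ro", "/dev/vg0/backup_snap", "/mnt/snap")
try:
    run("tar", "-czf", "/backups/app-backup.tar.gz", "-C", "/mnt/snap", ".")
finally:
    run("umount", "/mnt/snap")
    run("lvremove", "-f", "/dev/vg0/backup_snap")
```

The downtime is only as long as it takes to create the snapshot; copying the 200 GB happens afterwards from the frozen snapshot while the application is running again.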

The same thing can be accomplished by shutting down the application and database.

Without knowing what the application and database server are, that's as specific an answer as you're going to get.

longneck