
How can I watch a file or folder for changes, and when any updates occur, back up a copy of that file?

For example, when c:\orders\orders.xml is created or updated, write a copy to orders.xml.yyyymmddhhmmss

HopelessN00b
Mark Richman
  • If you are writing orders to an xml file, wouldn't you just query that xml file for a report later of the details/rows/orders? – TheCleaner Nov 24 '13 at 16:38
  • ...or does the file get overwritten? If so, how is the time window sized between writes? Also, when you write "file or folder," is that one file in a folder or several, and are their names known or unknown? – ErikE Nov 24 '13 at 17:04
  • The file is overwritten with the same name and different content every *N* minutes. I have an internal request for this functionality, so I'm not going to question its usefulness. – Mark Richman Nov 25 '13 at 14:16
  • Then I would go with MDMarra's solution, or write a script that compares the file hash value. If the value changed, copy to a timestamped name. Get-FileHash in PS v4 may be the ticket: http://technet.microsoft.com/en-us/library/dn520872.aspx – ErikE Nov 26 '13 at 21:14
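For reference, the hash-compare approach from the comment above could be sketched roughly like this. This is a minimal polling loop, assuming PowerShell 4.0+ for Get-FileHash; the path and the 60-second interval are just examples:

```powershell
# Poll a file and copy it to a timestamped name whenever its hash changes.
# Assumes PowerShell 4.0+ (Get-FileHash). Path and interval are examples.
$path = 'C:\orders\orders.xml'
$lastHash = $null

while ($true) {
    if (Test-Path $path) {
        $hash = (Get-FileHash -Path $path -Algorithm SHA256).Hash
        if ($hash -ne $lastHash) {
            $stamp = Get-Date -Format 'yyyyMMddHHmmss'
            Copy-Item -Path $path -Destination "$path.$stamp"
            $lastHash = $hash
        }
    }
    Start-Sleep -Seconds 60
}
```

Note that a change written and overwritten entirely between two polls would be missed, which is the usual trade-off of polling versus event-driven approaches.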

3 Answers


Enable file system auditing for each and every event that you want this to trigger on. Then, create an event trigger with a script attached to it for each relevant event ID (this is insane, btw).
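To illustrate the trigger half of that (as a sketch, not a recommendation): with object-access auditing enabled on the folder, Security event ID 4663 ("an attempt was made to access an object") is the usual candidate, and schtasks can bind a task to it. The script path here is a placeholder:

```powershell
# Create a scheduled task that fires on Security event 4663 (file access
# audit) and runs a copy script. The script path is a placeholder.
schtasks /Create /TN "OrdersBackup" `
    /TR "powershell.exe -File C:\scripts\copy-orders.ps1" `
    /SC ONEVENT /EC Security /MO "*[System[EventID=4663]]"
```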

Or, you could just periodically use a regular backup tool like everyone else does. Plenty of vendors offer snapshot-based continuous backup with 5-minute protection intervals now, including DPM, CommVault, FalconStor, etc. A combination of that and Volume Shadow Copies should get you file history and data protection.

MDMarra
    The issue I have with this is that the file may change more frequently than the backup interval. Why is the event trigger and script approach insane? – Mark Richman Nov 24 '13 at 03:22
  • I think it's just unorthodox, which translates into a slight leap into the less well known. So it could be brilliant if it exactly meets your need, but it would prescribe a tad of extra testing to find out whether there are quirks to know about before going live. – ErikE Nov 24 '13 at 17:14
  • I have the audit set up and working. How do I attach a script to it? I just need to make a copy of a file on every change. – Mark Richman Nov 25 '13 at 14:15
  • http://technet.microsoft.com/en-us/library/cc748900.aspx – ErikE Nov 26 '13 at 21:09

If you are open to using a hosted service for this, Crashplan will watch files in real time and backup upon change:

http://support.code42.com/CrashPlan/Latest/CrashPlan_Glossary#real-time

It also gives you the added benefit of off-site backup (assuming they meet the security requirements of the data in question).

wjimenez5271

In addition to MDMarra's event-ID idea, which clearly has something to it, I like running these as perpetually running background jobs under the Task Scheduler:

1) Running robocopy /mir on a folder with the /mon: and/or /mot: options, which turn robocopy into a continuous monitoring process. I do this on my office workstation for stuff that wouldn't otherwise end up in the server-side backup jobs, and to sync down local copies of things like installers, which can be good to have at hand. I thought it would be a quick hack until I replaced it with something more robust, but it works so well I'll stick with it.
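As an example of option 1 (paths illustrative): /mon:N re-runs the copy once at least N changes have been seen, and /mot:M makes robocopy wait M minutes between passes:

```powershell
# Mirror C:\orders to D:\backup\orders and keep monitoring: re-run after
# at least 1 change (/mon:1), checking no more often than every minute (/mot:1).
robocopy C:\orders D:\backup\orders /mir /mon:1 /mot:1
```

Note that /mir mirrors the source, so as MDMarra points out in the comments below, this alone won't produce multiple timestamped copies.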

2) For monitoring text file content changes, one can use Get-Content with the -Wait switch to keep a continuous watch, and possibly also the -Tail option (to avoid parsing the entire file when the job is restarted). As changes are detected, actions can be triggered, which makes logic such as renaming files to include timestamps possible. I use this to monitor logs, and it works fine.
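Option 2 might look something like this (paths are examples; -Tail 0 starts at the end of the file so a restarted job doesn't re-process old lines):

```powershell
# Watch a file for appended content and copy it to a timestamped name
# whenever new lines appear. Paths are illustrative; note this fires
# once per new line, so rapid writes produce many copies unless you
# add some debouncing.
$path = 'C:\orders\orders.xml'
Get-Content -Path $path -Wait -Tail 0 | ForEach-Object {
    $stamp = Get-Date -Format 'yyyyMMddHHmmss'
    Copy-Item -Path $path -Destination "$path.$stamp"
}
```

One caveat: Get-Content -Wait follows appends, so a file that is overwritten in place (rather than appended to) may not behave as expected with this approach.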

I use the Task Scheduler to launch the jobs at startup, then start the process every 5 minutes but allow only one instance. This keeps the jobs always running without spawning multiple identical processes.

ErikE
  • With mirroring via robocopy, how would you handle the need for multiple copies of the file with the date as the file extension? – MDMarra Nov 24 '13 at 14:36
  • One wouldn't. The PowerShell alternative would be able to, but to do the same via folder monitoring I guess one would script the whole thing. The question is slightly ambiguous in this sense, too. – ErikE Nov 24 '13 at 15:17