1

I have a Windows server machine that takes a daily backup, and each backup is about 1.5 GB, so every night I want to remove backup files that are older than one week.

This is how my backup files are organised:

backup.20091118.gz.gpg. As you can see, the 20091118 part identifies the file date, which is 2009/11/18 (year, month, day).

I am planning to write a quick batch script for this and schedule it via Task Scheduler. Is this a good idea? If so, I would be grateful for assistance with the batch script part.

Best Regards

Hellnar

5 Answers

2

Instead of worrying about the age of the files, first trim the directory down to 7 daily backups by hand, and from then on just delete the oldest file in the directory before each new backup runs.

Deleting the oldest file in a directory is pretty easy to do in a batch script:

SET "BACKUPDIR=C:\PATH\TO\BACKUPS"
FOR /F "delims=" %%i IN ('DIR /B /O-D "%BACKUPDIR%"') DO SET "OLDEST=%%i"
DEL "%BACKUPDIR%\%OLDEST%"

The only real trick is the command DIR /B /O-D, which lists plain file names sorted by date in reverse order (newest first, oldest last). The FOR loop captures each file name in turn in the OLDEST variable, so when the loop is done %OLDEST% expands to the name of the oldest file. The "delims=" option just keeps file names containing spaces intact.
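
If you want to guard against the directory holding fewer than a week of backups, a small variation on the same DIR trick (just a sketch; the backup.*.gz.gpg mask is taken from the question) counts the files and only deletes the oldest one when more than 7 are present:

SET "BACKUPDIR=C:\PATH\TO\BACKUPS"
SET /A COUNT=0
REM Count matching backups while remembering the last (oldest) name listed
FOR /F "delims=" %%i IN ('DIR /B /O-D "%BACKUPDIR%\backup.*.gz.gpg"') DO (
    SET /A COUNT+=1
    SET "OLDEST=%%i"
)
REM Only delete when more than 7 backups exist, so a full week is always kept
IF %COUNT% GTR 7 DEL "%BACKUPDIR%\%OLDEST%"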

David Webb
  • Seems like a clever idea; I wonder if it removes all files inside a folder that are older than 7 days? – Hellnar Nov 18 '09 at 09:14
  • @Hellnar - the script only removes the single oldest file in the directory, whether it is 5 minutes or 12 months old. It'll work as a solution here based on the assumption that there are 7 daily backups in the directory and the script is run once per day. – David Webb Nov 18 '09 at 09:31
1

Delete the oldest file right before you create the newest backup.

File deletion is quick, much quicker than backing up ~1.5GB, and the backup is already scheduled.
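
As a rough sketch of how the two steps could be chained in one scheduled batch file (run_backup.cmd is just a placeholder for whatever currently creates the backup; the path and file mask come from the question):

SET "BACKUPDIR=C:\PATH\TO\BACKUPS"
REM Find the oldest backup (listed last with /O-D) and remove it
FOR /F "delims=" %%i IN ('DIR /B /O-D "%BACKUPDIR%\backup.*.gz.gpg"') DO SET "OLDEST=%%i"
IF DEFINED OLDEST DEL "%BACKUPDIR%\%OLDEST%"
REM Then run the existing backup job (placeholder command)
CALL C:\PATH\TO\run_backup.cmd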

0

the "quickest" way?? A suggestion. download GNU find from here

then simply create scheduled task like this:

gnu_find.exe c:\path -type f -iname "backup.*gpg" -mtime +7 -delete
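
One thing to check: -mtime matches against the file's last-modified time, not the date embedded in the file name, so make sure those agree for these backups. Swapping -delete for -print (standard GNU find options) gives a harmless dry run first:

gnu_find.exe c:\path -type f -iname "backup.*gpg" -mtime +7 -print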
user37841
0

http://winhlp.com/node/180

http://windowsitpro.com/Files/07/40511/Weblisting_01.txt

http://www.shell-tips.com/2006/09/27/delete-old-files-by-last-access-date/

https://stackoverflow.com/questions/51054/batch-file-to-delete-files-older-than-n-days
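
For reference, the essential idea behind those links: on Windows Server 2003 / Vista and later, the built-in forfiles command can delete files older than N days directly. A sketch using the file mask from the question (adjust the path to your backup directory):

forfiles /P C:\PATH\TO\BACKUPS /M backup.* /D -7 /C "cmd /c del @path"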

  • Whilst this may theoretically answer the question, [it would be preferable](http://meta.stackoverflow.com/q/8259) to include the essential parts of the answer here, and provide the link for reference. – Mark Henderson Jun 16 '14 at 08:18
0

You might want to look into smarter and more efficient backup technology than just archiving complete snapshots. OS X has Time Machine, which can make hourly backups. It only stores changed files, so despite being uncompressed it's very space efficient. Due to some clever indexing, after the initial backup it's far faster than a full snapshot. And because it just stores the files, with no fancy archive format, recovering a file from backup is as easy as copying a file.

There's likely something similar for Windows. Seagate Replica and Genie Timeline are two possibilities.

Schwern