In the course of doing backups, I need a batch script that will delete files in a specified directory that are older than, let's say, 3 days. The script will be set up as a scheduled task to run at a specified time every day.
- Are you tied to batch? What about PowerShell? – Nate Apr 14 '11 at 15:39
- PowerShell should be able to work; I'm just used to batch and I've never played with PowerShell. – Chiggins Apr 14 '11 at 15:58
8 Answers
forfiles -p c:\pathtofiles\ -m *.rar -d -5 -c "cmd /c del @path"
Where -5 is the age of the files you want to delete (5 days or older in this case). This command deletes .rar files; drop -m *.rar if you want to delete any file type.
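For comparison, the same age-based cleanup can be sketched in Python (the function name and the example path are placeholders for illustration, not part of forfiles):

```python
import os
import time

def purge_old_files(target_dir, suffix="", max_age_days=5):
    """Delete plain files under target_dir whose last-modified time is
    more than max_age_days ago. suffix="" matches any file type."""
    cutoff = time.time() - max_age_days * 86400  # 86400 seconds per day
    removed = []
    for name in os.listdir(target_dir):
        path = os.path.join(target_dir, name)
        # Only delete regular files that match the suffix and are old enough.
        if (os.path.isfile(path)
                and name.endswith(suffix)
                and os.path.getmtime(path) < cutoff):
            os.remove(path)
            removed.append(path)
    return removed

# Example (path is a placeholder):
# purge_old_files(r"c:\pathtofiles", suffix=".rar", max_age_days=5)
```

Like forfiles /D -5, this compares against the last-modified timestamp, not the creation time.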

- Of all these answers this is the only one (so far) that directly answers the question. – John Gardeniers Feb 20 '12 at 19:00
If PowerShell is acceptable (it should be, as it's enabled by default on Server 2008+), try this:
$numberOfDays = 3
$Now = Get-Date
$TargetFolder = "C:\myoldfiles"
$LastWrite = $Now.AddDays(-$numberOfDays)
$Files = Get-ChildItem $TargetFolder -Include *.bak, *.x86 -Recurse | Where-Object {$_.LastWriteTime -le $LastWrite}
foreach ($File in $Files)
{
    Write-Host "Deleting file $File" -ForegroundColor "Red"
    Remove-Item $File | Out-Null
}
Source here.

- +1. Also, for testing purposes you can add the `-WhatIf` flag: `Remove-Item $File -WhatIf | Out-Null`. To run as a scheduled task, the job should specify the full path to powershell.exe with your script named as the argument. – AndyN Apr 14 '11 at 16:29
- I came up with the same solution to start with -- but ran into problems with the memory load associated with getting all the files up front. – Tim Barrass Mar 25 '15 at 14:41
If you insist on using batch files, Robocopy.exe is your answer. It's fast (multithreaded) and very robust. For your scenario you can use the following as a guide:
:: Moves dir & files older than 3 days to i:\Destination
:: Wildcards acceptable
robocopy i:\Source\ i:\Destination\ /MOVE /MIR /MINAGE:3 /ETA
:: Removes the destination tree
rd /s /q i:\destination
There is a long list of options; run robocopy /? to see them all. You can even use it for incremental backups, scheduling, creating backup profiles, etc.
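The same two-step pattern (move or delete files older than N days, then clean up the emptied subdirectories) can be sketched in Python; the purge_tree name and the example path are assumptions for illustration, not robocopy features:

```python
import os
import time

def purge_tree(root, max_age_days=3):
    """Recursively delete files older than max_age_days under root,
    then remove any subdirectories left empty, mirroring the
    robocopy /MOVE + rd two-step above."""
    cutoff = time.time() - max_age_days * 86400  # seconds per day
    # topdown=False visits leaf directories first, so we can
    # remove a subdirectory right after emptying it.
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                os.remove(path)
        # Drop the directory itself if everything in it is gone
        # (but keep the root of the tree).
        if dirpath != root and not os.listdir(dirpath):
            os.rmdir(dirpath)

# Example (path is a placeholder):
# purge_tree(r"i:\Source", max_age_days=3)
```

Unlike the robocopy version, this deletes in place rather than moving to a staging directory first, so test carefully (e.g. by printing instead of removing) before scheduling it.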

- All the +1's for robocopy! <3 Re your example: I'm not sure the /MIR switch is right for this situation; the /MOVE should take care of Chiggins' question. One thing's for sure, robocopy has a ton of options, and careful examination is required. – JamesCW Feb 22 '12 at 19:52
You might look at Horst Schaeffer's DelAge32:
http://home.mnet-online.de/horst.muc/wbat32.htm#top
DelAge32 - ver. 2.3 (c) 2003-2008, Horst Schaeffer
Deletes or moves files (path with file pattern) by age (number of days)
Syntax: DelAge32 filespec age [options]
Options:
/created /accessed /modified (default) - file stamp used to evaluate age
/includeRO - include read-only files
/includeH - include hidden files
/includeS - include system files
/includeRHS - include read-only, hidden and system files
/recurse - include subdirectories
/subonly - /recurse excluding initial directory
/rd - remove empty subdirectories
/move path - move files to specified path
/preview - list, but no action
/quiet - no output
Your command can be as simple as:
delage32.exe c:\logdirectory\*.log 3
I have this command running as a scheduled task.

- Delage32.exe gets a 0/53 rating at virustotal.com, where zero means no threats detected among the top 53 AV tools. You may want to change your AV to something more mainstream. – RobW Aug 12 '16 at 22:42
- It is not my choice. I'm just warning anyone that uses Cylance to either whitelist delage32 or use forfiles. – Sun Aug 12 '16 at 22:54
This is a PowerShell script I wrote to do what you want - it does a bit more too. I use it to clear down logs and other temporary files.
purge-dem-logs.cmd
powershell.exe -command "& 'c:\purgelogs\purgelogs.ps1' -Filepath D:\SQL\backup\ -filemask *.bak -Maxdays 14 "
purgelogs.ps1:
Param ($FilePath, $FileMask, $MaxDays, [switch]$Recurse)
if (($FilePath -eq $null) -or ($FileMask -eq $null) -or ($MaxDays -eq $null)) {
    Write-Host "Usage: .\purgelogs.ps1 -FilePath [Path] -FileMask [Mask] -MaxDays [Max Days]"
    Write-Host " "
    Write-Host "Example: "
    Write-Host "    .\purgelogs.ps1 -FilePath c:\temp -FileMask *.log -MaxDays 30"
    break
}
if (Test-Path $FilePath) {
    $FilePath += "*"
    $Now = Get-Date
    $LastWrite = $Now.AddDays(-$MaxDays)
    Write-Host "Last write time " $LastWrite
    if ($Recurse) {
        $Files = Get-ChildItem $FilePath -Include $FileMask -Recurse | Where-Object {$_.LastWriteTime -le $LastWrite}
    } else {
        $Files = Get-ChildItem $FilePath -Include $FileMask | Where-Object {$_.LastWriteTime -le $LastWrite}
    }
    if ($Files -eq $null) {
        Write-Host "No matching files found"
    } else {
        foreach ($File in $Files)
        {
            # You can add -WhatIf to see the consequences: Remove-Item $File -WhatIf
            Write-Host "Deleting file $File" -ForegroundColor "Red"
            Remove-Item $File | Out-Null
        }
    }
}
else
{
    Write-Host "The folder $FilePath does not exist!"
}

The scripts above will not work for remote computers, and admins need to manage multiple machines. The script below deletes folders on multiple remote computers without having to log in to each one.
It deletes folders older than 15 days; you can change that via the $Days parameter.
D$\Program Files (x86)\Research In Motion\BlackBerry Enterprise Server\Logs is the UNC path for the BlackBerry log folder; change it to the directory where your logs/folders are located.
List all your server names in a servers.txt file, located in the same directory as this script.
cd C:\Scripts\Powershellscripts\deletefiles    # change this to the directory where you put the script
$Days = "15"
$Now = Get-Date
$LastWrite = $Now.AddDays(-$days)
$server = get-content servers.txt
foreach ($node in $server)
{
get-childitem -recurse "\\$node\D$\Program Files (x86)\Research In Motion\BlackBerry Enterprise Server\Logs" | Where-Object {$_.LastWriteTime -le $LastWrite} | remove-item -recurse -force
}
Save the script as a .ps1 file and run it. You can also schedule it via a batch file; in that case you need the Change Directory command at the beginning of the script.
Have fun.

As an alternative approach: instead of relying on querying the filesystem to get file creation times (and hitting the same files over multiple days, until they expire) you could add the file to an index of your own at time of creation. The index could potentially be as simple as a file named after the creation date, stored in a known location, with a file per line.
If you have a multithreaded/multiprocess app creating files, then you might want your index handled in a more sophisticated way.
The advantage is that you always have a simple-to-process list of the files created on a given day that you can iterate over, rather than having to ask the filesystem again and again for details.
(This would rely on the app, and file creation, being managed by you, and not by a third party).
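A minimal sketch of that index idea in Python, assuming a per-day text file of paths (the names record_creation, purge_day, and file_index are hypothetical, not from the answer):

```python
import os
from datetime import date, timedelta

INDEX_DIR = "file_index"  # hypothetical location for the per-day index files

def record_creation(path, created=None):
    """Append a newly created file's path to an index file named after
    its creation date, e.g. file_index/2011-04-14.txt."""
    day = (created or date.today()).isoformat()
    os.makedirs(INDEX_DIR, exist_ok=True)
    with open(os.path.join(INDEX_DIR, day + ".txt"), "a") as f:
        f.write(path + "\n")

def purge_day(days_ago):
    """Delete every file listed in the index for the given day, then
    drop the index file itself -- no filesystem scan needed."""
    day = (date.today() - timedelta(days=days_ago)).isoformat()
    index = os.path.join(INDEX_DIR, day + ".txt")
    if not os.path.exists(index):
        return
    with open(index) as f:
        for line in f:
            path = line.strip()
            if os.path.exists(path):
                os.remove(path)
    os.remove(index)
```

The expiry job then touches only one small index file per day instead of re-stat-ing every file in the tree.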
