
I am trying to recursively extract some file information on my file server. I was able to run the commands below on my own laptop, but when I run them on my file server, which holds 4 TB of data, they run (or hang) for hours.

When I use the program called TreeSize to look at directory sizes, it goes through all 4 TB of data and displays the usage in less than 10 minutes.

My question: is there a way to extract the file information from 4 TB of data using cmd or PowerShell that is as fast as TreeSize?

forfiles /s /d -2/21/2017 /c  "cmd /c echo @fdate, @ftime, @path" >  ./myfile.csv

dir /q /s /o:d > ./myfile2.txt
hello
  • Have you considered installing the FSRM feature and using some of the reports it offers? https://technet.microsoft.com/en-us/library/cc771212(v=ws.11).aspx – Clayton Mar 29 '17 at 14:13
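
Regarding the FSRM suggestion above: on Windows Server with the File Server Resource Manager role service installed, its PowerShell module can generate storage reports on demand. A minimal sketch, assuming the FileServerResourceManager module is available (the report name, namespace, and report type are placeholders):

# Hypothetical on-demand FSRM storage report for one volume;
# adjust the namespace and report type to your needs
Import-Module FileServerResourceManager
New-FsrmStorageReport -Name 'LargeFilesOnD' -Namespace 'D:\' -ReportType LargeFiles -Interactive

The -Interactive switch runs the report immediately instead of on a schedule; results land under C:\StorageReports by default.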

1 Answer


You are executing a new cmd.exe for every file, which is terribly expensive. This is a well-known problem:

There are some disadvantages to using CMD.exe with FORFILES, a new process will be created and destroyed for every file that FORFILES processes, so if you loop through 1000 files, then 1000 copies of CMD.exe will be opened and closed, this will affect performance.
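
You can see the overhead for yourself with Measure-Command, timing both approaches against the same tree. A rough sketch (the path is a placeholder; keep the test tree small, because the forfiles run will be slow):

# Placeholder test path; use a small tree or the first run will take a while
$dir = 'C:\test'

# One cmd.exe process is spawned per matching file
Measure-Command {
    forfiles /p $dir /s /d "-2/21/2017" /c "cmd /c echo @fdate, @ftime, @path" | Out-Null
}

# The same work done in a single PowerShell process
Measure-Command {
    Get-ChildItem $dir -Recurse |
        Where-Object { $_.CreationTime -lt [datetime]'2/21/2017' } |
        Select-Object FullName, LastWriteTime |
        Out-Null
}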

In contrast, size utilities like TreeSize have been optimized for speed:

TreeSize Free works on the MFT (Master File Table) and reaches extremely high scan speeds. Scanning operations run in a thread, so you will see results almost instantly while TreeSize Free is working in the background.

You can do better by not calling more executables. PowerShell is great at this, even at my novice level:

# One pipeline in one process: no cmd.exe is spawned per file
Get-ChildItem D:\path\ -Recurse |
Where-Object { $_.CreationTime -and $_.CreationTime -lt [datetime]'2/21/2017' } |
Select-Object FullName, LastWriteTime |
Export-Csv myfile.csv -NoTypeInformation

The desired output format and the other file are an exercise for the reader.
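
That said, if you want columns resembling @fdate, @ftime, and @path, one possible sketch using calculated properties (the column names and date formats are just one choice):

# Approximate forfiles' date, time, and path columns with calculated properties
Get-ChildItem D:\path\ -Recurse |
Where-Object { $_.CreationTime -lt [datetime]'2/21/2017' } |
Select-Object @{n='Date';e={$_.LastWriteTime.ToString('MM/dd/yyyy')}},
              @{n='Time';e={$_.LastWriteTime.ToString('HH:mm:ss')}},
              @{n='Path';e={$_.FullName}} |
Export-Csv myfile.csv -NoTypeInformation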

John Mahowald
  • How do I use MFT to extract the file information? – hello Mar 23 '17 at 06:29
  • The Master File Table is a low-level component of NTFS, and not something easy to read directly. Focus on not calling a program on every file, and speeding up your storage system first. Then you can read a book on NTFS if you want. – John Mahowald Mar 23 '17 at 13:05