I am trying to recursively extract some file information from my file server. The commands below run fine on my own laptop, but when I run them on the file server, which holds about 4 TB of data, they run (or hang) for hours.
When I use a program called TreeSize to look at directory sizes, it goes through the same 4 TB of data and displays the usage in under 10 minutes.
My question is: is there a way to extract the file information from 4 TB of data using cmd or PowerShell as quickly as TreeSize does?
:: List date, time, and path for files last modified before 2/21/2017
forfiles /s /d -2/21/2017 /c "cmd /c echo @fdate, @ftime, @path" > myfile.csv

:: Recursive listing with owner (/q), sorted by date (/o:d)
dir /q /s /o:d > myfile2.txt
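For comparison, here is the kind of single-pass traversal I imagine a tool like TreeSize does internally, sketched in Python (an assumption on my part, not something from TreeSize's docs): `os.scandir` yields each entry's metadata from the directory read itself, so on Windows it avoids a separate per-file lookup, which I suspect is what makes `dir /s` and `forfiles` so slow over the network.

```python
import csv
import os
import sys
from datetime import datetime


def walk(root):
    """Yield (modified_time, path) for every file under root, iteratively."""
    stack = [root]
    while stack:
        folder = stack.pop()
        try:
            with os.scandir(folder) as entries:
                for entry in entries:
                    if entry.is_dir(follow_symlinks=False):
                        stack.append(entry.path)
                    else:
                        # On Windows, DirEntry.stat() is served from data
                        # already returned by the directory listing, so no
                        # extra metadata call is made per file.
                        mtime = entry.stat(follow_symlinks=False).st_mtime
                        yield datetime.fromtimestamp(mtime), entry.path
        except PermissionError:
            continue  # skip folders this account cannot read


if __name__ == "__main__":
    writer = csv.writer(sys.stdout)
    for mtime, path in walk(sys.argv[1] if len(sys.argv) > 1 else "."):
        writer.writerow([mtime.strftime("%m/%d/%Y"),
                         mtime.strftime("%H:%M:%S"), path])
```

This is only a sketch of the enumeration technique; I have not benchmarked it against TreeSize on the 4 TB share, and I would still prefer a pure cmd or PowerShell answer if one exists.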