
I've been using the following piece of code to get the disk usage starting at a specific directory:

$usage = (Get-ChildItem $webRoot -recurse | Measure-Object -property length -sum)
$diskUsage = $usage.sum

The problem is that this takes a VERY long time compared with simply right-clicking a directory in Windows and looking at the properties.

Do we have access to the functionality that Explorer uses to get disk usage? Or is there another way that will be faster than the method I've been using?

I'm also not sure what will happen if there are circular links in the area my PS code is searching. I assume it could lock up.

FrodoFraggins

3 Answers


Try the "old way":

$fso = New-Object -ComObject Scripting.FileSystemObject
$fso.GetFolder($webRoot).Size
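
For example, to report the result in gigabytes and release the COM object afterwards (a rough sketch, not part of the original answer; it assumes $webRoot is already defined):

$fso = New-Object -ComObject Scripting.FileSystemObject
$sizeBytes = $fso.GetFolder($webRoot).Size                               # size in bytes, subfolders included
"{0:N2} GB" -f ($sizeBytes / 1GB)                                        # format as gigabytes
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($fso)   # release the COM object
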
Shay Levy

I doubt it can be made faster. Disk space analysis tools (like TreeSize) take a while to churn through the directories on the disk.

northben
  • It's strange if the OS can't make its fast calculations available to PowerShell. – FrodoFraggins Apr 02 '13 at 13:51
  • I don't think the OS has a faster calculation. The single value for total space free/used for the entire disk is kept up to date with changes to the file system. But imagine if the OS tried to keep tabs on that information at every level of the filesystem: let's say you change a single file in a directory 5 levels deep -- that would require 5 calculations. ... Now extrapolate that out to something like opening Visual Studio - even an SSD couldn't handle that! :) – northben Apr 02 '13 at 13:58

Even when you right-click and view Properties, Explorer still has to calculate the size; that's what happens for me on my network drive. It takes a while because there are 1,000,000+ files and 500,000+ directories.

Using the command you supplied, it first gets all the files and folders and stores them in memory. Just doing $files = gci $path -r takes a long time because there are so many files. Then your command passes that big chunk of memory to Measure-Object, and finally Measure-Object does the calculation. So you're waiting for a while with no progress.

When PowerShell processes a ForEach-Object cmdlet, it handles each value as it passes through the pipeline, so it uses less memory at any given time.

What you can do is use a ForEach-Object (%) loop to process the size and have it output what it has calculated so far. Printing text to the screen always slows things down, but at least you see progress. Remove the cls; "Total: $($total/1GB)GB" part for more efficiency.

gci $path -file -r | % {$total += $_.length; cls; "Total: $($total/1GB)GB"}
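
If the constant clearing and printing is itself too slow, one variation (just a sketch, not part of the original answer) is to refresh the display only every few thousand files:

$total = 0; $count = 0
gci $path -file -r | % {
    $total += $_.length
    $count++
    if ($count % 5000 -eq 0) { cls; "Total so far: $($total/1GB) GB" }   # update the display every 5000 files
}
"Final total: $($total/1GB) GB"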

Run Measure-Command on your command and on the other commands given as answers to see which one is fastest.
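
For instance, a side-by-side timing might look like this (using $webRoot as a stand-in for whatever path you are testing):

Measure-Command { (Get-ChildItem $webRoot -Recurse | Measure-Object -Property Length -Sum).Sum }
Measure-Command { (New-Object -ComObject Scripting.FileSystemObject).GetFolder($webRoot).Size }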

E.V.I.L.