
EDIT: Hoping here to clarify my convoluted and misleading question, which was based on my mistaken assumption that -file accepts inputs. Thanks for setting me straight and pointing out that it's just a switch parameter; the inputs in my example actually get passed to -path. Sounds like that may be the fastest purely-PowerShell way to search for multiple file types, since -filter accepts only a single input and -include is slower.

The Get-ChildItem documentation says "Filters are more efficient than other parameters, because the provider applies them when retrieving the objects, rather than having Windows PowerShell filter the objects after they are retrieved."

v3 has a new parameter set with a -file parameter, probably meant for excluding directories, matching cmd.exe's `dir /a:-d`.

Like -include and unlike -filter, -file accepts multiple inputs, as in `gci -file "*.ldf","*.bak"`.

So I'm wondering, and have thus far failed to reliably test, whether -file is like -filter from a performance perspective, i.e. "more efficient", or more like the "other parameters" such as -include. If -file is a filter, that's nice, because AFAICT -filter only handles one filter at a time; if you want multiple filters (like *.ldf and *.bak) you need to either run gci -filter twice or use -include instead. So I'm wondering whether -file lets us reap the efficiency benefits of a filter, for multiple filters.
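For concreteness, the two pre-v3 alternatives described above can be sketched like this (C:\data is just a placeholder path):

```powershell
# Option 1: run -filter once per pattern. Each pass filters at the
# provider level, but the directory tree is walked twice.
$results = "*.ldf", "*.bak" | ForEach-Object {
    Get-ChildItem -Path C:\data -Filter $_
}

# Option 2: one pass with -include. It accepts an array, but PowerShell
# filters the objects after retrieval, and -include only takes effect
# with a wildcard path or -Recurse.
$results = Get-ChildItem -Path C:\data\* -Include "*.ldf", "*.bak"
```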

I stumbled on some error text that's got me optimistic. The -file parameter wants -path to be the current directory, so `gci -path $path -file "*.bak","*.ldf"` gives an error. Push-Location seems like a viable workaround, but here I'm more interested in the content of the error text:

Get-ChildItem : Cannot convert 'System.Object[]' to the type 'System.String' required by parameter 'Filter'. Specified method is not supported.

I called -file but the error complains about "parameter 'Filter'". So maybe -file is efficient like a filter? OTOH, -filter doesn't need -path to be the current directory, so in that respect -file is more like -include.
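The Push-Location workaround mentioned above might look like this sketch (C:\data stands in for $path):

```powershell
Push-Location C:\data
try {
    # With no explicit -path, the pattern array binds positionally to -Path
    Get-ChildItem -File "*.bak", "*.ldf"
}
finally {
    Pop-Location   # restore the previous location even if gci throws
}
```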

noam

1 Answer


Just one clarification:

gci -path $path -file "*.bak","*.ldf"

-file is a switch parameter (as -directory is) and doesn't accept values ("To get only files, use the File parameter and omit the Directory parameter. To exclude files, use the Directory parameter and omit the File parameter"); so "*.bak","*.ldf" is implicitly passed to -filter as a value, and -filter accepts only a string, not a string[]. That's where your error came from.
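One way to watch this binding yourself is Trace-Command; a sketch, run from the directory you want to search:

```powershell
# With -Path already bound, the array falls positionally to -Filter
# (a String), which is what produced the error above:
#   Get-ChildItem -Path $path -File "*.bak","*.ldf"
# Without an explicit -Path, the array binds positionally to -Path
# (a String[]), so this succeeds:
Trace-Command -Name ParameterBinding -PSHost -Expression {
    Get-ChildItem -File "*.bak", "*.ldf"
}
```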

Regarding performance: using -file or -directory is faster than using a `where-object psiscontainer` or `? { !$_.psiscontainer }`, because the filtering is done at the provider level (FileSystem, in this case).
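A rough way to compare the two approaches yourself (C:\data is an example path; absolute numbers will vary with machine and tree size):

```powershell
# Switch parameter: directories are excluded at the provider level
Measure-Command { Get-ChildItem C:\data -Recurse -File } |
    Select-Object TotalMilliseconds

# Where-Object: every object is retrieved, then filtered by PowerShell
Measure-Command {
    Get-ChildItem C:\data -Recurse | Where-Object { -not $_.PSIsContainer }
} | Select-Object TotalMilliseconds
```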

CB.
  • Another way to list only directories is `dir -attributes d`; to list only files, `dir -attributes !d`. Or filter for any other file attributes and any combination of them. – CB. Dec 07 '12 at 20:58
  • Thanks, that's good news. If -file inputs are implicitly passed to -filter, then -file should indeed offer the performance benefit of a filter, for multiple filters... whereas when calling -filter directly, we were limited to a single filter input (*.ldf or *.bak but not both). – noam Dec 07 '12 at 22:07
  • @noam the benefit is only that directories are excluded, in this specific case. Otherwise "*.bak","*.ldf" is assigned to the first parameter that accepts string[] – CB. Dec 07 '12 at 22:22
  • so this: `gci -file "*.ldf","*.bak"` is equivalent to this `gci -file -path "*.ldf","*.bak"` ? – noam Dec 07 '12 at 22:36
  • @noam the difference is that if there are folders with names ending in .ldf or .bak, with the -file param they are excluded. Follow me?? – CB. Dec 07 '12 at 22:56
  • Yeah, thx for bearing with me. Now I see that -file is just a switch parameter. So it sounds like the most efficient (purely powershell) way to search for both bak and ldf files would be to use `-path "*.ldf","*.bak"` from the current directory. Or am I missing a better way? – noam Dec 07 '12 at 23:17
  • @noam IMO it is. Adding `-file` gives some better performance because folders are skipped. For finding files in deep folder recursion from the command line, the best is using `cmd.exe /c dir *.ldf,*.bak /s` and eventually parsing the result in PowerShell to cast the returned values to `[fileinfo]`, but it's just an opinion! – CB. Dec 07 '12 at 23:22
  • @noam in v3 `get-childitem -path *.ldf,*.bak -r -File` is the best for deep folder search IMO. – CB. Dec 07 '12 at 23:30