
Using PowerShell, I am downloading and extracting a file that has a directory and another file in it (it's basically from https://aka.ms/downloadazcopy-v10-windows). I'd like to be able to get into the directory after extraction.

So, in PS, I am at c:\AzCopyTest while downloading the file. It is being extracted at the same location. Here's the code for it:

$URL = "https://aka.ms/downloadazcopy-v10-windows"
New-Item -ItemType Directory -Path c:\AzCopyTest
$Destination = "c:\AzCopyTest\zzz.zip"
$WebClient = New-Object -TypeName System.Net.WebClient
$WebClient.DownloadFile($URL, $Destination)

$ExtractLocation = "c:\AzCopyTest"
$ExtractShell = New-Object -ComObject Shell.Application
$file = $ExtractShell.NameSpace($Destination).Items()
$ExtractShell.NameSpace($ExtractLocation).CopyHere($file)
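For context, on PowerShell 5.0+ the same download-and-extract flow can be sketched with the built-in Invoke-WebRequest and Expand-Archive cmdlets, which avoids the Shell.Application COM object; Expand-Archive blocks until extraction finishes, whereas CopyHere can return before the files are in place. This is a sketch under the same paths as above, not a drop-in replacement:

```powershell
# Sketch using built-in cmdlets (PowerShell 5.0+); same paths as the question.
function Get-AndExtract {
    param([string]$Url, [string]$ExtractLocation)
    New-Item -ItemType Directory -Path $ExtractLocation -Force | Out-Null
    $zip = Join-Path $ExtractLocation "zzz.zip"
    Invoke-WebRequest -Uri $Url -OutFile $zip
    # Expand-Archive is synchronous, so the extracted directory exists
    # as soon as this call returns.
    Expand-Archive -Path $zip -DestinationPath $ExtractLocation -Force
}

# Usage, matching the question:
# Get-AndExtract -Url "https://aka.ms/downloadazcopy-v10-windows" -ExtractLocation "c:\AzCopyTest"
```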

How can I get into the folder after the extraction is done? FYI, I don't want to ls into it directly (or manually). I'd like to be able to list out the items in that directory and get the first directory. The directory is called azcopy_windows_amd64_10.3.4, BTW. When MS releases a new version (say 10.3.5), the directory will be renamed, and I do not want to go back in and change it manually. You get where I am going with this..

I know Get-ChildItem -Path $ExtractLocation -Recurse will list the items in the directory and its sub-directories, but unfortunately it does not serve my purpose.

Any help is greatly appreciated!

Anonymous Person

1 Answer


I tried this: (Get-ChildItem -Path $ExtractLocation -Recurse -Directory -Force -ErrorAction SilentlyContinue).Name and it worked.

FYI, at a later time, if MS decides to add another directory to the zipped file, you can simply do this:

(Get-ChildItem -Path $ExtractLocation -Recurse -Directory -Force -ErrorAction SilentlyContinue).Name[<position_of_the_directory_starting_from_0>]
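Putting it together, one way to land in the extracted folder without hard-coding the version number is a small helper like the one below. Get-ExtractedDir is a hypothetical name, and picking the most recently written sub-directory is an assumption that works when the zip has just been extracted; it survives a rename from 10.3.4 to 10.3.5:

```powershell
# Hypothetical helper: return the newest sub-directory under $Path,
# so a version bump (azcopy_windows_amd64_10.3.5, etc.) is picked up
# automatically without editing the script.
function Get-ExtractedDir {
    param([string]$Path)
    Get-ChildItem -Path $Path -Directory -ErrorAction SilentlyContinue |
        Sort-Object LastWriteTime -Descending |
        Select-Object -First 1
}

# Usage, matching the layout in the question:
# Set-Location (Get-ExtractedDir -Path "c:\AzCopyTest").FullName
```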

Anonymous Person