I'm looking for a way to extract multiple tar files into their own folders with a single step in my pipeline.
I have a mono-repo pipeline that builds multiple Remix applications. Each application packages its deployment as a TGZ file. I have an artifact folder called bundles that contains just the tgz files, each in its own subfolder.
i.e.:

bundles
  \app1
    ---\app1.tgz
  \app2
    ---\app2.tgz
In a later stage, I need to extract these tgz files so that I can run some scans.
My pipeline has the following steps. The first pulls down the above folder into the job's working folder, bundles. The second is intended to extract the TGZ files.
- task: DownloadPipelineArtifact@2
  displayName: "Download Artifact: bundles"
  inputs:
    artifact: bundles
    path: bundles

- task: ExtractFiles@1
  displayName: Test Extract
  condition: always()
  inputs:
    archiveFilePatterns: 'bundles/**/*.tgz'
    destinationFolder: 'bundle_extracts'
    cleanDestinationFolder: false
    overwriteExistingFiles: false
I never know ahead of time which apps will have a tgz in this folder, so I cannot reference any of them by name. Sometimes there will be only one tgz file; other times there could be more, but each will always be in its own folder.
When there is only one tgz, the second step above works fine. But with multiple tgz files, I get errors because they all extract into the same destination folder, bundle_extracts.
So my question is this: how do I build this so that each tgz extracts into its own directory tree?
Ideally, the bundle_extracts folder would look something like this:

bundle_extracts
  \app1
    ---{all the contents from app1.tgz}
  \app2
    ---{all the contents from app2.tgz}
I have tried putting wildcards in the destinationFolder argument, but that didn't do anything.
I have also attempted Bash and PowerShell scripts, but I never seem to get the command lines right.
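For reference, here is a minimal sketch of the kind of Bash loop I have been trying. It is untested and assumes tar is available on the agent and the bundles/bundle_extracts layout described above; it walks each app's tgz under bundles and extracts it into a matching subfolder of bundle_extracts.

# Sketch only: extract each app's tgz into its own folder under bundle_extracts.
for archive in bundles/*/*.tgz; do
  app="$(basename "$(dirname "$archive")")"      # e.g. app1, app2
  mkdir -p "bundle_extracts/$app"
  tar -xzf "$archive" -C "bundle_extracts/$app"
done

My assumption is that something like this would run from a Bash@3 (or script) step in place of the ExtractFiles@1 task, but I have not gotten it working, so either a corrected script or a way to do this with ExtractFiles itself would be welcome.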