
I have downloaded log files from a GCP bucket that I need to parse in a later task. For now, though, the files are compressed: over 40k .gz files. I tried a few commands and none of them worked:

find . -prune -name '*.gz' -exec gunzip {} +

find . -type f -exec gunzip {} +

Any idea how I can decompress that many files from the command line? I am using Bash on macOS.

Antonio Petricca
toh19

1 Answer


One optimization you can make is to pass more than one file to each gunzip invocation:

find . -type f -name \*.gz -print0 | xargs -0 gunzip
  • find scans the current directory and every directory below it.
  • It locates all regular files (-type f) whose names end in .gz.
  • -print0 writes the matching paths to the pipe null-terminated (\0), so file names containing spaces or newlines survive intact.
  • xargs -0 reads that null-terminated input.
  • xargs then calls gunzip with as many file names as fit on one command line, batching the 40k files into relatively few gunzip invocations.
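A quick way to sanity-check the pipeline before unleashing it on 40k real files is to run it against a handful of throwaway .gz files in a temporary directory. This is just a demo sketch; all paths and file names below are invented:

```shell
#!/usr/bin/env bash
# Demo: exercise the find | xargs -0 gunzip pipeline on sample files.
set -eu

tmp=$(mktemp -d)
mkdir -p "$tmp/logs/a" "$tmp/logs/b"

# Create a few sample compressed log files (hypothetical names).
for i in 1 2 3; do
  printf 'line %s\n' "$i" > "$tmp/logs/a/file$i.log"
  gzip "$tmp/logs/a/file$i.log"
done
printf 'hello\n' > "$tmp/logs/b/other.log"
gzip "$tmp/logs/b/other.log"

# The pipeline from the answer: null-terminated paths, batched gunzip.
( cd "$tmp" && find . -type f -name \*.gz -print0 | xargs -0 gunzip )
```

After it runs, the .gz files are replaced in place by their decompressed originals (gzip's default behavior), so the directory tree ends up containing only .log files.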
James Risner