
I have a folder with 200k PDF files and want to tar them. Which of the following solutions would be better? I suspect the command substitution might run into problems with the huge number of files because of command-line length limitations.

Process Substitution

tar -cf out.tar -T <(compgen -G '*.pdf')

Command Substitution (Might break command length limitations?)

tar -cf out.tar `compgen -G '*.pdf'`
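
For reference, the limit in question is the kernel's cap on the combined size of a command's arguments and environment, which `getconf` can report. A rough sketch of the arithmetic (the average name length here is an assumption):

getconf ARG_MAX
# typically ~2 MB on Linux; 200k names at ~30 bytes each is ~6 MB of argv,
# so the command-substitution form would fail with "Argument list too long".
# The -T <(...) form instead hands tar a file name (e.g. /dev/fd/63) to read
# the list from, so the names never land on the command line.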
Roland
  • Why not use `xargs` and the append mode of `tar`? Something like `compgen -G '*.pdf' | xargs tar --append -f out.tar`. – Renaud Pacalet Oct 19 '21 at 11:46
  • The first one if command line length is a potential issue, of course. – Shawn Oct 19 '21 at 11:49
  • @RenaudPacalet That would be another option, but process substitution seems simpler to me – Roland Oct 19 '21 at 11:59
  • 1
    @Roland Sure. But `xargs` was especially designed to solve the command line length limitations. It has some other nice properties like the NUL input separator, for instance, that you could want to use in case some of your files have spaces in their name. – Renaud Pacalet Oct 19 '21 at 12:26
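
Following up on the comments: `compgen` output is newline-separated, so a NUL-separated pipeline needs a different producer. A sketch of two NUL-safe variants, assuming the PDFs sit directly in the current directory (the `-maxdepth 1` mirrors the non-recursive glob):

# Single tar run; GNU tar's --null makes -T - read NUL-separated names
# from stdin, so argv limits never come into play.
find . -maxdepth 1 -name '*.pdf' -print0 | tar --null -cf out.tar -T -

# The xargs variant from the comment, made NUL-safe; xargs batches the
# names, so tar may run several times, appending one batch per run.
find . -maxdepth 1 -name '*.pdf' -print0 | xargs -0 tar --append -f out.tar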

1 Answer


I tested both.

Process substitution works; command substitution breaks because the argument list is too long.

I also tested `tar --append`, feeding it the file list with both `find` and `xargs`. The results suggest that appending is much slower.
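
The answer does not show the exact commands; a sketch of how such a comparison might be timed (the archive names are placeholders):

# One tar invocation, file list read via process substitution
time tar -cf out1.tar -T <(compgen -G '*.pdf')

# Repeated appends: each tar --append run must scan to the end of the
# growing archive before writing, which is what makes this much slower.
time find . -maxdepth 1 -name '*.pdf' -print0 | xargs -0 tar --append -f out2.tar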

Carlos Marx