
I need to execute jpeg-recompress on thousands of files. I've tried:

for image in *.jpg; do
    image2=${image%.*}.jpg;
    ./jpeg-recompress /path/to/images/$image /path/to/images/$image2;
    echo "Processing $image2 file...";
done

But it fails with the error: Maximum number of arguments exceeded. I tried using find, something like:

find /path/to/images -iname “*.jpg” | while read image; do

(and the rest of the previous argument)

But it doesn't seem to do anything. I was wondering how I could run this program on multiple files?

jpeg-recompress image.jpg image.jpg

(Does replace the original, as intended)

  • The double-quotes in your `find` command look like fancy unicode quotes, which do not function as shell quotes (they'll get treated as part of the filename pattern to search for, not quotes *around* the pattern). Try it again with plain ASCII quotes. – Gordon Davisson Dec 23 '16 at 18:44

2 Answers


Large file lists

There are many ways to get around this. The first that comes to mind is using find.

for image in $(find /path/to/jpegs -type f -name '*.jpg')
do
    something "$image"   # note: this form breaks on filenames containing whitespace
done

There is also an xargs command that may prove useful.
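A sketch of the xargs route, assuming the question's ./jpeg-recompress binary and /path/to/images directory (both are placeholders here):

```shell
# -print0 / -0 keep filenames with spaces intact; -I{} substitutes each
# path once per invocation, and we pass it twice so the tool overwrites
# the original, as in the question.
find /path/to/images -type f -iname '*.jpg' -print0 |
    xargs -0 -I{} ./jpeg-recompress {} {}
```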

Continuing to use a for loop (rather than find alone) can be advantageous: you can append each processed file to a log, then skip those files the next time you run the command, should you desire to do so.
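That log idea could be sketched like this (processed.log and the paths are placeholders, and the jpeg-recompress path is assumed from the question):

```shell
# Skip any file already recorded in the log; append each file once done.
log=processed.log
touch "$log"
find /path/to/images -type f -iname '*.jpg' | while IFS= read -r image; do
    grep -qxF "$image" "$log" && continue   # already processed, skip it
    ./jpeg-recompress "$image" "$image"
    printf '%s\n' "$image" >> "$log"        # record it for the next run
done
```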

Aaron

You can do something simple with just a find:

find /path/to/images -name '*.jpg' -exec jpeg-recompress {} {} \;

Or, if you have multiple cores, you could get fancier and write a loop that compresses multiple files in parallel.
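One way to parallelize, sketched with xargs -P (supported by both GNU and BSD xargs; the paths are placeholders from the question):

```shell
# Run up to 4 recompressions concurrently; -print0 / -0 handle
# filenames with spaces, and {} {} passes each file as both input
# and output so the original is replaced in place.
find /path/to/images -type f -iname '*.jpg' -print0 |
    xargs -0 -P4 -I{} ./jpeg-recompress {} {}
```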

Unbeliever