82

I'm trying to construct a find command to process a bunch of files in a directory using two different executables. Unfortunately, -exec on find doesn't let you use a pipe, or even \|, because the shell interprets that character first.

Here is specifically what I'm trying to do (which doesn't work because pipe ends the find command):

find /path/to/jpgs -type f -exec jhead -v {} | grep 123 \; -print
Farvardin
hoyhoy

6 Answers

90

Try this:

find /path/to/jpgs -type f -exec sh -c 'jhead -v {} | grep 123' \; -print
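One caveat: splicing {} directly into the quoted command string is fragile (some find implementations won't substitute it inside quotes, and unusual filenames can break the command). A more robust sketch of the same idea passes the filename as a positional parameter; here a simple cat | grep pipeline stands in for jhead, and the /tmp paths are purely illustrative:

```shell
# Illustrative test files (hypothetical paths).
mkdir -p /tmp/findtest
echo "abc 123 def" > "/tmp/findtest/with space.txt"
echo "no match here" > /tmp/findtest/other.txt

# Pass each filename to sh as "$1" rather than splicing {} into the string;
# -print then fires only for files where the inner pipeline succeeds.
find /tmp/findtest -type f -exec sh -c 'cat "$1" | grep -q 123' sh {} \; -print
# prints: /tmp/findtest/with space.txt
```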

Alternatively you could try to embed your exec statement inside a sh script and then do:

find -exec some_script {} \;
Damien Pollet
Martin Marconcini
15

A slightly different approach would be to use xargs:

find /path/to/jpgs -type f -print0 | xargs -0 jhead -v | grep 123

which I have always found a bit easier to understand and to adapt (the -print0 and -0 arguments are necessary to cope with filenames containing blanks).

This is also likely to be more efficient than using -exec, because the list of files is piped to xargs, and xargs makes sure the jhead command line does not get too long by batching many filenames per invocation.
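As a self-contained sketch of this pipeline (grep over file contents stands in for jhead, which may not be installed, and the paths are illustrative), with grep -l added so you can still see which files matched:

```shell
# Illustrative test files.
mkdir -p /tmp/xargstest
echo "tag 123" > "/tmp/xargstest/a photo.txt"
echo "tag 456" > /tmp/xargstest/b.txt

# NUL-delimited names survive blanks; xargs batches them onto one grep call.
# grep -l prints the names of the files that matched.
find /tmp/xargstest -type f -print0 | xargs -0 grep -l 123
# prints: /tmp/xargstest/a photo.txt
```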

Palmin
    The problem with using xargs here is that I need the name of the file that matches. This command does find the matches, but I don't know which file matched. – hoyhoy Sep 15 '08 at 20:11
4

With -exec you can run only a single executable with some arguments, not arbitrary shell commands. To circumvent this, you can use sh -c '<shell command>'.

Do note that the use of -exec is quite inefficient: for each file found, the command has to be executed again. Avoid this where you can (for example, by moving the grep outside the -exec, or by piping the results of find to xargs as Palmin suggests).
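One middle ground, assuming a reasonably modern find: -exec … {} + batches many filenames into a single invocation, much as xargs does. A small sketch with illustrative paths:

```shell
# Illustrative test files.
mkdir -p /tmp/plustest
echo "contains 123" > /tmp/plustest/a.txt
echo "contains 456" > /tmp/plustest/b.txt

# {} + packs many filenames into one grep invocation instead of one per file;
# grep -l reports which of them matched.
find /tmp/plustest -type f -exec grep -l 123 {} +
# prints: /tmp/plustest/a.txt
```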

mweerden
    Another way to avoid the multiple process inefficiency in the general case is to use xargs. If you happen to need separate processes, you can use the -i option. I find xargs more in keeping with the Unix model. – Jon 'links in bio' Ericson Sep 17 '08 at 21:00
    AOL on xargs use. mweerden, perhaps you should change your last paragraph by taking account of the xargs existence. Also note the -0 flag that exists in both `find` and `xargs`. – tzot Oct 11 '08 at 23:14
3

Using the find command for this type of task may not be the best approach. I frequently use the following to find files that contain the requested information:

for i in dist/*.jar; do echo ">> $i"; jar -tf "$i" | grep BeanException; done
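If the search needs to recurse into subdirectories, the same loop can be driven by find instead of a glob (a sketch with illustrative paths; grep over plain files stands in for jar -tf so it runs anywhere):

```shell
# Illustrative test files.
mkdir -p /tmp/jartest/sub
echo "BeanException" > /tmp/jartest/sub/a.txt
echo "something else" > /tmp/jartest/b.txt

# Recurse with find; passing the name as "$1" keeps blanks in filenames intact.
find /tmp/jartest -type f \
  -exec sh -c 'echo ">> $1"; grep BeanException "$1"' sh {} \;
```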
Sparhawk
Dimitar
1

As this outputs a list, would you not do:

find /path/to/jpgs -type f -exec jhead -v {} \; | grep 123

or

find /path/to/jpgs -type f -print -exec jhead -v {} \; | grep 123

Put your grep on the results of the find -exec.

Xetius
  • That doesn't work because I need the -print to work. If grep returns a success, then find prints the file name, otherwise it doesn't. – hoyhoy Sep 15 '08 at 09:54
0

There is another way you can do it, but it is also pretty hacky.

Using the shell option extquote, you can do something like the following to make find echo the command and then pipe it to sh:

root@ifrit findtest # find -type f -exec echo ls $"|" cat \;|sh
filename


root@ifrit findtest # find -type f -exec echo ls $"|" cat $"|" xargs cat \;|sh
h

I just figured I'd add that because, at least the way I visualized it, it was closer to the OP's original question of using pipes within -exec.

linuxgeek