
From time to time I have to append some text at the end of a bunch of files. I would normally find these files with find.

I've tried

find . -type f -name "test" -exec tail -n 2 /source.txt >> {} \;

This, however, writes the last two lines of /source.txt to a literal file named {}, once for each file that matched the search criteria.

I guess I have to escape >> somehow, but so far I haven't been successful.

Any help would be greatly appreciated.

Bart C
    `-exec bash -c 'tail -n 2 /source.txt >> "$1"' bash {} \;`. Also, I haven't tested it, but storing the output of `tail -n 2` in some environment variable and writing that would probably be faster if you're worried about performance (could also play around with `+` and `tee` variants) – Reinstate Monica Please Aug 05 '15 at 10:44
    I'd just pipe into xargs instead – 123 Aug 05 '15 at 10:50
  • The reason this doesn't work is the same as [this](http://stackoverflow.com/questions/62044/how-do-i-use-a-pipe-in-the-exec-parameter-for-a-find-command), but not exactly sure if that's considered a duplicate so switching my comment to an answer. – Reinstate Monica Please Aug 05 '15 at 11:12

2 Answers


-exec takes a single command (with optional arguments), and you can't use shell operators like >> in it. In your command, the >> is processed by the shell that launched find, before find ever runs, which is why everything ends up in a file literally named {}.

So you need to wrap it in a bash -c '...' block, which executes everything between '...' in a new bash shell.

find . -type f -name "test" -exec bash -c 'tail -n 2 /source.txt >> "$1"' bash {} \;

Note: everything after the quoted '...' script is passed to it as positional arguments, but the numbering starts at $0 rather than $1. The extra bash after the closing quote is a placeholder that fills $0, so arguments and error messages behave the way you'd expect in a regular shell: $1 is the first real argument, and errors are prefixed with bash (or whatever meaningful name you put there).
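A quick illustration of that numbering (the echoed strings are just for demonstration):

```shell
# Without a placeholder, the first argument after the script lands in $0, not $1:
bash -c 'echo "zero=$0 one=$1"' bash hello
# prints: zero=bash one=hello
```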

If execution time is an issue, consider doing something like export variable="$(tail -n 2 /source.txt)" and using "$variable" in the -exec. That also guarantees every file gets the same text, unlike running tail inside -exec, where the output could change if /source.txt changes mid-run. Alternatively, you can terminate with -exec ... + and pair it with tee to write to many files at once.
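A sketch of those two variants (untested, as noted above; the variable name last_lines is chosen here for illustration, not part of the original answer):

```shell
# Variant 1: read /source.txt once, export the text, and append it per file.
export last_lines="$(tail -n 2 /source.txt)"
find . -type f -name "test" -exec bash -c 'printf "%s\n" "$last_lines" >> "$1"' bash {} \;

# Variant 2: let find batch the file names with + and append to all of them via tee.
find . -type f -name "test" -exec sh -c 'tail -n 2 /source.txt | tee -a "$@" > /dev/null' sh {} +
```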

Reinstate Monica Please
  • It works! Thanks for the solution and explanation. It's usually not too many files but if that changes I'll come back to your comments. – Bart C Aug 05 '15 at 11:24

A more efficient alternative (assuming bash 4):

shopt -s globstar
to_augment=( **/test )
tail -n 2 /source.txt | tee -a "${to_augment[@]}" > /dev/null

First, you create an array with all the file names, using a simple pattern that should be equivalent to your call to find. Then, use tee to append the desired lines to all those files at once.

If you have more criteria for the find command, you can still use it to fill the array. This version is not foolproof (it assumes no file name contains a newline), but fixing that is best left to another question.

while read -r fname; do
    to_augment+=( "$fname" )
done < <(find ...)
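Putting the two pieces together, a sketch of the full find-based version (the -name "test" criterion is carried over from the question; substitute your own):

```shell
# Collect matching file names into the array, then append once with tee,
# exactly as in the globstar version.
to_augment=()
while read -r fname; do
    to_augment+=( "$fname" )
done < <(find . -type f -name "test")
tail -n 2 /source.txt | tee -a "${to_augment[@]}" > /dev/null
```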
chepner