
Normally, to search and replace a string across a selection of files, I would find the matches with:

grep -rl 'pattern' .

This works fine unless you have a LOT of matches.

Anyway, I was looking around and found a suggestion of using find instead:

find /root/path -type f -exec grep -l 'FIND' {} +

This also works fine on its own, but when I try to pass its output into Perl to do the replacement, I still get an error:

perl -p -i -e 's|FIND|REPLACE|g' `find /root/path -type f -exec grep -l 'FIND' {} +`

-bash: /usr/local/bin/perl: Argument list too long

Is there any way around this?

Andrew Newby
  • Redirect to a file, then read that file line by line calling your perl command/script with it – ivanivan Jul 13 '17 at 13:56
  • Type `man xargs` into your terminal window – Jenny D Jul 13 '17 at 13:56
  • @ivanivan ah ok, so there isn't a one liner you can do? – Andrew Newby Jul 13 '17 at 13:57
  • @JennyD thanks - not sure how that helps though? – Andrew Newby Jul 13 '17 at 13:58
  • Not really... MOST programs have a max of 255 arguments, this is the limit you are hitting.... – ivanivan Jul 13 '17 at 14:00
  • @ivanivan mmm ok, although I'm sure I've gone well over that limit before (as I've run greps on thousands of files). Anyway, maybe I'll give it a go like you suggested with a .cgi script that reads the piped values, and then do the regexp replacement on it manually – Andrew Newby Jul 13 '17 at 14:01
  • Not sure if this affects the set of commands you are running, but I've read somewhere that using `+` in find adds as many files as possible as args to grep at once, while using `\;` executes grep once per file found. – Bryan CS Jul 13 '17 at 14:04
  • @BryanCS - thanks for the suggestion. Unfortunately it didn't work (same error about the argument list being too long) – Andrew Newby Jul 13 '17 at 14:09
  • @ivanivan thanks for the tip. I've written a little script (below) and shared it so hopefully, others can benefit as well. It's pretty simple, and does the job pretty efficiently :) – Andrew Newby Jul 13 '17 at 14:24
  • why not use just `sed`? – alexus Jul 13 '17 at 14:25
  • @alexus, won't that just have the same problem though? ;) – Andrew Newby Jul 13 '17 at 14:28
  • I don't think so... sed is a *stream* editor; I think it'd run much faster as well. – alexus Jul 13 '17 at 16:15
  • @alexus - ah ok, interesting! I will give it a go when I next need to do a large grep :) (I've already done the stuff I needed to with the script below, but it's good to have a cleaner option for the future) – Andrew Newby Jul 14 '17 at 05:48
  • BTW, I assume I would run it in a similar way? `sed -i 's/$find/$replace/g' \`find $path -type f -exec grep -l '$find' {} +\`` – Andrew Newby Jul 14 '17 at 05:53
  • @alexus - I gave it a go, but still get an error: `-bash: /bin/sed: Argument list too long` – Andrew Newby Jul 14 '17 at 07:47
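For reference, the limit being hit in the comments above is a byte limit on the combined size of the argument list and environment passed to a single `exec()` call, not a fixed count of 255 arguments. The backtick expansion puts every matching filename onto one command line, which fails once that limit is exceeded. You can see the actual limit on your system with:

```shell
# Print the kernel's limit (in bytes) on the combined size of
# argv + environment for a single exec() call. Command substitution
# that expands to more than this produces "Argument list too long".
getconf ARG_MAX
```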

1 Answer


As suggested by people, I needed to write a little script to do the magic for me :)

In my case I have a LOT of files, so to help speed things up (and make the find a bit kinder on the server), I split the work into a separate pass for each of the larger folders. The script is:

#!/usr/bin/perl

use strict;
use warnings;

my $find    = q|string-to-find|;
my $replace = q|replace-with|;

# One find/grep pass per top-level folder keeps each file list small.
my @paths = split /\n/, q|/home/user/folder1
/home/user/folder2
/home/user/folder3
/home/user/folder4|;

my $debug = 1;

foreach my $path (@paths) {
    my @files = split /\n/, `find $path -type f -exec grep -l '$find' {} +`;

    foreach my $file (@files) {
        next unless -f $file;
        print qq|Doing $file\n| if $debug > 0;

        # The list form of system() bypasses the shell, so filenames
        # containing spaces or quotes are passed to sed safely.
        system('sed', '-i', "s/$find/$replace/g", $file);
    }
}

Then just run it from SSH with:

perl script-name.cgi
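For completeness, the `xargs` approach from the comments can do the same job without a helper script: piping the file list through `xargs` batches the arguments into as many `sed` invocations as needed, so the ARG_MAX limit is never hit. A minimal sketch (the path and patterns are placeholders; `-Z`/`-0` keep filenames with spaces or newlines intact):

```shell
# List matching files NUL-separated, then let xargs batch them
# into safely-sized sed command lines. -r skips sed entirely if
# nothing matched.
grep -rlZ 'FIND' /root/path | xargs -0 -r sed -i 's/FIND/REPLACE/g'
```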
Andrew Newby