I would like to efficiently search through a few hundred log files for ~200 filenames. I can easily do this with grep's `-f` option, putting the needles in a file.
However, there are a few problems:
- I'm interested in doing this efficiently, as in How to use grep efficiently?
- I want to know all the matches for each search term (i.e. each filename) in each log file separately; `grep -f` just reports matches as it encounters needles in each file.
- I would like to know when a filename is not matched anywhere (a rough sketch of what I'm after follows this list).
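Roughly, this is the shape of what I'm after — an untested sketch, assuming the logs are named `access_log-*` in the current directory, the needles contain no `/` or `:` characters, and `xargs` supports `-P` (the BSD xargs on OS X does):

    # one grep per needle, up to 4 at a time; each needle's hits land in
    # results/<needle>.txt, so an empty file means "no match anywhere"
    mkdir -p results
    xargs -P 4 -I {} sh -c 'grep -nF -e "$1" access_log-* > "results/$1.txt"' _ {} < needle
    # report the needles that matched nothing
    while read -r n; do
        [ -s "results/$n.txt" ] || echo "NOT FOUND: $n"
    done < needle
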
Hardware: 2.7 GHz i7 MacBook Pro with 16 GB of RAM.
Using `grep -ron -f needle *` gives me:
    access_log-2013-01-01:88298:google
    access_log-2013-01-01:88304:google
    access_log-2013-01-01:88320:test
    access_log-2013-01-01:88336:google
    access_log-2013-01-02:396244:test
    access_log-2013-01-02:396256:google
    access_log-2013-01-02:396262:google
where `needle` contains:

    google
    test
The problem here is that the whole directory is searched for any match from `needle`, and the process is single-threaded, so it takes forever. There is also no explicit indication when a needle fails to match anywhere.
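Alternatively, I've been toying with keeping the single grep pass and post-processing its output instead. A sketch, assuming the log filenames contain no colons (so field 3 of the `file:line:match` output is always the needle itself, since `-o` prints the matched string) and adding `-F` since the needles are fixed strings:

    grep -ronF -f needle access_log-* |
    awk -F: 'NR == FNR { hits[$0] = 0; next }   # first file: load the needles
             { hits[$3]++ }                     # count hits per needle
             END { for (n in hits)
                       print (hits[n] ? hits[n] " matches: " n : "NOT FOUND: " n)
             }' needle -

This only counts totals per needle; keying on `$3 ":" $1` instead would keep the per-log-file breakdown. But it still does nothing about the single-threaded scan itself.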