5

I have a find script that automatically opens a file if just one file is found. The way I currently handle it is doing a word count on the number of lines of the search results. Is there an easier way to do this?

if [ "$( cat "$temp" | wc -l | xargs echo )" == "1" ]; then
    edit `cat "$temp"`
fi

EDITED - here is the context of the whole script.

term="$1"
temp=".aafind.txt"

find src sql common -iname "*$term*" | grep -v 'src/.*lib'  >> "$temp"

if [ ! -s "$temp" ]; then
    echo "ø - including lib..." 1>&2
    find src sql common -iname "*$term*"  >> "$temp"
fi


if [ "$( cat "$temp" | wc -l | xargs echo )" == "1" ]; then
    # just open it in an editor
    edit `cat "$temp"`
else
    # format output
    term_regex=`echo "$term" | sed "s%\*%[^/]*%g" | sed "s%\?%[^/]%g" `
    cat "$temp" | sed -E 's%//+%/%' | grep --color -E -i "$term_regex|$"
fi

rm "$temp"
redolent
  • 1
    Is `$temp` a real file containing filenames or a variable with one or more filenames? – grebneke Jan 29 '14 at 22:52
  • 2
    Why do you need `xargs echo`? Are you trying to get rid of the spaces around the output of `wc`? Just get rid of the double quotes around `$(...)`. – Barmar Jan 29 '14 at 22:56
  • Can you post more of your script? It's not at all clear what `$temp` is or how you go about finding the files. – Reinstate Monica Please Jan 29 '14 at 23:02
  • Edited. As you can see, I use the temp file to format the output. – redolent Jan 29 '14 at 23:23
  • 1
    `find ... | grep -v` is silly -- you can just tell find to prune things you don't want. `find src sql common -path 'src/*lib' -prune -o -iname "*$term*" -print` doesn't need any grepping. – Charles Duffy Jan 30 '14 at 01:13
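The prune-based approach from the comment above can be demonstrated end-to-end. This is a hedged sketch; the directory layout is invented purely for the demo, and the real script would use its own `src sql common` paths:

```shell
# Demo: skip src/*lib subtrees with -prune instead of piping find through grep -v.
# The directories and files below are made up for illustration only.
mkdir -p demo/src/foolib demo/src/app demo/sql
touch demo/src/foolib/match.txt demo/src/app/match.txt demo/sql/match.txt

# -path ... -prune skips the pruned subtree entirely;
# -o -iname ... -print emits everything else that matches
find demo -path 'demo/src/*lib' -prune -o -iname '*match*' -print

rm -r demo
```

Because the pruned subtree is never descended into, no post-filtering with `grep -v` is needed, and filenames that happen to contain the filtered pattern elsewhere are not accidentally dropped.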

6 Answers

7

Unless I'm misunderstanding, the variable $temp contains one or more filenames, one per line, and if there is only one filename it should be edited?

[ $(wc -l <<< "$temp") = "1" ] && edit "$temp"

If $temp is a file containing filenames:

[ $(wc -l < "$temp") = "1" ] && edit "$(cat "$temp")"
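Both forms can be tried in isolation; a small demo (bash, with an invented file name), covering the variable case with a here-string and the file case with a redirect:

```shell
# Case 1: the list lives in a variable, one filename per line.
# The here-string appends a trailing newline, so one name counts as 1.
temp='only-file.txt'
[ "$(wc -l <<< "$temp")" -eq 1 ] && echo "one name in the variable"

# Case 2: the list lives in a file
printf '%s\n' 'only-file.txt' > list.txt
[ "$(wc -l < list.txt)" -eq 1 ] && echo "one name in the file"
rm list.txt
```

Note that `-eq` (numeric comparison) tolerates the leading spaces some `wc` implementations pad their output with, whereas a quoted string comparison with `=` would not.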
grebneke
  • Consider making the last line `... && edit "$(<"$temp")"` for bash, or `... && edit "$(cat "$temp")"` otherwise -- avoiding the subshell makes things more efficient (given a shell with support), and adding the missing quotes makes things more correct (for filenames containing newlines, wildcard characters, etc). That said, +1. – Charles Duffy Jan 30 '14 at 17:11
  • @CharlesDuffy - thanks, double-quotes added. Every time you post an answer or comment, I learn something new, good job! The `$(<"$temp")` construct makes me a little uneasy, links to good explanation? – grebneke Jan 30 '14 at 17:34
  • From the "command substitution" section of the bash man page: `The command substitution $(cat file) can be replaced by the equivalent but faster $(< file).` Roughly, it's a formulation which directly reads from the file without involving a subshell and subprocess involved in using cat in a command substitution, but at the penalty of losing compatibility with POSIX sh. – Charles Duffy Jan 30 '14 at 19:47
3

Several of the results here will read through an entire file, whereas one can stop and have an answer after one line and one character:

if { IFS='' read -r result && ! read -n 1 _; } <file; then
  echo "Exactly one line: $result"
else
  echo "Either no valid content at all, or more than one line"
fi

For safely reading from find, if you have GNU find and bash as your shell, replace <file with < <(find ...) in the above. Even better, in that case, is to use NUL-delimited names, such that filenames with newlines (yes, they're legal) don't trip you up:

if { IFS='' read -r -d '' result && ! read -r -d '' -n 1 _; } \
        < <(find ... -print0); then
    printf 'Exactly one file: %q\n' "$result"
else
    echo "Either no results, or more than one"
fi
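The simpler (newline-delimited) form above can be exercised against throwaway files to see all three outcomes; the helper function name and file names here are invented for the demo:

```shell
# Returns success only if the file has exactly one line:
# the first read succeeds, and a follow-up one-character read hits EOF.
one_line() {
  { IFS='' read -r line && ! read -n 1 _; } < "$1"
}

printf 'a.txt\n' > exactly_one
printf 'a.txt\nb.txt\n' > two_lines
: > empty

one_line exactly_one && echo "exactly_one: yes"
one_line two_lines   || echo "two_lines: no"
one_line empty       || echo "empty: no"

rm exactly_one two_lines empty
```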
Charles Duffy
  • your solution is pretty neat, certainly more efficient, but it's not answering the OP's question, which is whether there is a more **easy** way to do it. Your solution is anything but easy! – zmo Jan 29 '14 at 23:06
  • 1
    @zmo A "solution" that doesn't cover the corner cases isn't a solution at all, it's a bug waiting to happen. Using `wc -l` to count filenames definitely doesn't cover the corner cases -- a single filename with three newlines in its name would be counted as four separate files! So, no, this isn't easy, but the latter, bash+GNU version is also the only correct solution I've seen given here at all. – Charles Duffy Jan 29 '14 at 23:06
  • well, the OP wants two cases: 1 line, or not 1 line (which includes 0 lines, 42 lines, no file ...). Good point about the edge case with `\n` in the file name, did not think about that one :-) but what if there is `EOF` in the filename? – zmo Jan 29 '14 at 23:09
  • @zmo the OP specified that what's actually being counted are filenames. POSIX system filenames can't be safely stored in newline-separated streams at all (because you can't tell the difference between one filename containing three newlines and four separate filenames), so the entire question is founded on an invalid premise. – Charles Duffy Jan 29 '14 at 23:10
  • @zmo ...and by the way, yes, I already cover both the zero and more-than-one cases identically; I'm not clear on why you're reiterating that part of the requirement. – Charles Duffy Jan 29 '14 at 23:13
  • 1
    i like this answer because it does avoid reading the whole file, which no wc based solution could accomplish. Should also be possible to do with something like grep -m 2 . and a grep solution will not need to mess with IFS – frankc Jan 30 '14 at 00:43
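The `grep -m 2` idea from the comment above can be sketched as follows (a hedged example, assuming GNU or BSD grep and an invented results file; like any line-count approach, it still miscounts filenames that contain newlines):

```shell
# -m 2 stops reading after two matching lines, so a huge result file
# is never scanned to the end; -c then reports 0, 1, or 2.
printf 'a.txt\n' > results

if [ "$(grep -c -m 2 . results)" -eq 1 ]; then
    echo "exactly one result"
fi

rm results
```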
2

If you want to test whether the file is empty or not, test -s does that.

if [ -s "$temp" ]; then
    edit `cat "$temp"`
fi

(A non-empty file written by `find` always ends with a newline, so it contains at least one complete line; you should find that `wc -l` agrees.)
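One caveat worth demonstrating: a non-empty file whose content lacks a trailing newline is seen as non-empty by `test -s` but counted as zero lines by `wc -l`, since `wc -l` counts newline characters. This does not arise with `find` output, but it is why the two checks are not interchangeable in general (file name invented for the demo):

```shell
printf 'a.txt' > noeol        # content, but no trailing newline

[ -s noeol ] && echo "test -s: non-empty"
echo "wc -l counts: $(wc -l < noeol)"

rm noeol
```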

If you genuinely want a line count of exactly one, then yes, it can be simplified substantially;

if [ $( wc -l <"$temp" ) = 1 ]; then
    edit `cat "$temp"`
fi
tripleee
  • I am guessing because your original post(s) didn't really answer the question, skipping the 'just one file' bit entirely. The downvote was there before your edits :) – bryn Jan 29 '14 at 22:59
  • 1
    +1 -- least-silly non-pedantic answer (though I'm a little sketchy on using unquoted expansion to get the name list -- filenames with spaces are a thing that exist, after all, as are glob characters)... actually, damnit, talked myself out of that upvote. – Charles Duffy Jan 29 '14 at 23:17
2

Well, given that you are storing these results in the file $temp this is a little easier:

[ "$( wc -l < $temp )" -eq 1 ] && edit "$( cat $temp )"

Instead of `$(cat $temp)` you can do `$(< $temp)`, but it might take away some readability if you are not very familiar with redirection 8)
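For comparison, a tiny demo of both spellings (bash; the file name is invented). Both command substitutions strip the trailing newline, so they yield the same string; the `$(< file)` form just avoids spawning a `cat` process:

```shell
printf '%s\n' 'only-file.txt' > list.txt

# POSIX-portable: runs cat in a subshell
name1=$(cat list.txt)
# bash-only: the shell reads the file itself, no external process
name2=$(< list.txt)

[ "$name1" = "$name2" ] && echo "both yield: $name1"

rm list.txt
```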

bryn
1

You can use arrays:

 x=($(find . -type f))
 [ "${#x[*]}" -eq 1 ] && echo "just one" || echo "many"

But you might have problems in case of filenames with whitespace, etc.

Still, something like this would be a native bash way to do it.
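The whitespace problem can be avoided with NUL-delimited names; a hedged sketch, assuming bash 4.4+ (for `mapfile -d ''`) and a `find` that supports `-print0` (the demo directory is invented):

```shell
mkdir -p demo && touch demo/'name with spaces.txt'

# NUL-delimited filenames survive spaces, newlines, and glob characters
mapfile -t -d '' x < <(find demo -type f -print0)

if [ "${#x[@]}" -eq 1 ]; then
    echo "just one: ${x[0]}"
else
    echo "many (${#x[@]})"
fi

rm -r demo
```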

Jakub Kotowski
  • See entry #1 in http://mywiki.wooledge.org/BashPitfalls -- string-splitting find's output like this is dangerous. (Have filenames containing whitespace? Wildcards? Whitespace surrounding wildcards?) – Charles Duffy Jan 29 '14 at 22:53
  • Yes, I was just about to add a comment about it but got stuck thinking about solving it with null terminated filenames – Jakub Kotowski Jan 29 '14 at 22:58
1

No, this is the way, though you're making it over-complicated:

if [ "`wc -l $temp | cut -d' ' -f1`" = "1" ]; then 
    edit "$temp";
fi

What's complicating it is:

  • a useless use of `cat`,
  • an unnecessary use of `xargs`

and I'm not sure if you really want the ``edit `cat $temp` `` form, which edits the file(s) named by the content of `$temp`, rather than editing `$temp` itself.
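The distinction being drawn here can be shown with a throwaway list file (file names invented for the demo): `"$temp"` names the temp file itself, while `"$(cat "$temp")"` expands to the filename stored inside it.

```shell
temp=list.txt
printf '%s\n' 'real-file.txt' > "$temp"

# "$temp" is the list file itself; "$(cat "$temp")" is the name inside it
echo "the temp file:     $temp"
echo "the file it names: $(cat "$temp")"

rm "$temp"
```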

zmo
  • 1
    Redirecting `wc -l <"$temp"`would do away with the `cut` also. – tripleee Jan 29 '14 at 22:55
  • 1
    Using `==` inside of `[ ]` isn't valid POSIX -- POSIX test uses a single `=` for comparison. – Charles Duffy Jan 29 '14 at 22:58
  • @tripleee good point, but I still prefer it that way. I think it's more readable that way ; @charles-duffy, good point, though I'm always using `==` in my scripts to not get a bad habit when I code in C (and `==` is totally ok in zsh/bash) – zmo Jan 29 '14 at 23:02
  • I mean, the OP asked "Is there an **easier** way to do this?, not whether there is a more **efficient** way to do this. His solution is pretty natural, with a few useless stuff, and my solution is the way I'd do it. Readable and simple. – zmo Jan 29 '14 at 23:05
  • 1
    @zmo better to be in the habit of using `[[ ]]` rather than `[ ]` then -- that way you get more benefits from trading away portability. – Charles Duffy Jan 29 '14 at 23:12
  • @zmo 'easy' and 'readable' are different things, and `wc -l < "$temp"` is arguably more readable than piping to cut. If you can use a `|`, why discriminate against `<`? ;) – bryn Jan 29 '14 at 23:16
  • well, I'm not going to argue over that, because none of `wc -l < "$temp"` or using `wc` is really better, it's only a question of taste. My only point is that I gave the way I'd do it, and neither the most efficient or shortest way to do it! – zmo Jan 29 '14 at 23:23