
Using GNU find, I can use the -maxdepth option to limit how deep the search descends. Unfortunately, my command also needs to run on HP-UX, AIX, and Solaris, whose find implementations don't support the -maxdepth option.
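For example, with GNU find something like this limits the search to two levels deep (the path and depth here are only placeholders):

find /some/path -maxdepth 2 -type f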

I have found that I can run find /some/path/* -prune to get only the files in a single folder, but I want to recurse down n levels, just as the -maxdepth argument allows. Can this be done in a cross-platform way?
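In principle the -prune trick can be stacked, one glob per level. For instance, something like this covers two levels:

find /some/path/* /some/path/*/* -prune

However, shell globs skip dot files, and a pattern that matches nothing is passed to find verbatim, so this gets unwieldy quickly.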

Edit: I found I can use the -path option to apply a similar filter, like so

find ./ ! -path "./*/**"

Unfortunately, AIX find does not support the -path option. I'm at least a little bit closer.
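For what it's worth, the -path filter generalizes by repeating the wildcard. Since * in a -path pattern also matches /, something like this should keep everything at depth 2 or less:

find . ! -path "./*/*/*"

Note that this only filters the output; find still walks the entire tree.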

ben
  • Is there any reason you don't just install the GNU find packages on all three operating systems? – Tripp Kinetics May 16 '14 at 18:44
  • This command needs to work on any arbitrary Linux, HP-UX, AIX, and Solaris machine. In production, I will not have access to these machines. I am willing to use some tool other than find if it is installed by default on these systems, but there are other requirements, as my real find command is longer than the one in my question. – ben May 16 '14 at 19:16
  • I should add that I can use a different command on each system if necessary, but something universal is preferred. – ben May 16 '14 at 19:33
  • If you don't mind having `find` find all files in the directory tree and then postprocess its output, you can do something like `find ... | grep -v -E '/[^/]+/[^/]+/[^/]+/'` to filter out pathnames that are deeper than your desired depth (a concrete sketch follows these comments). – Mark Plotnick May 20 '14 at 19:21
  • I believe that our requirement is intended to improve performance by not searching the entire tree. If I find that I can still use a filter after searching the full tree, I will use your technique. – ben May 21 '14 at 20:17
  • 1
    Here's another option, although it's a bit complicated. [This answer from unix stackexchange](http://unix.stackexchange.com/a/18356/49439) shows how to run `find` in a directory while not looking in any of its subdirectories. So you could run `find` to search for files at the top level of a directory, then run a slightly different `find` to enumerate all the directories at the top level of that directory. Then, run another similar pair of `find` commands in each of those directories, and so on for as many levels as you want. – Mark Plotnick May 27 '14 at 16:34
  • That certainly is more complicated, but would definitely work. It turns out our requirement to use maxdepth is pretty loose, so our solution ended up being to just drop support for it on the systems without that flag, but if we need to add it in the future, I will likely use your manual depth traversal suggestion. – ben May 28 '14 at 16:22
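To make the post-filtering idea from the comments concrete, here is a rough sketch (the . start path and the depth limit of 2 are placeholders, and it assumes no file names contain newlines):

find . -print | grep -v -E '^\./[^/]+/[^/]+/'

As discussed above, find still traverses the whole tree here; the pipe only trims the output.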

1 Answer


This may not be the most performant solution, but it should be quite portable. I tested it on Solaris in addition to OS X and Linux. In essence, it is a recursive depth-first tree walk using ls. Feel free to tweak and sanitize it to your needs. Hopefully it works on AIX too.

#!/bin/bash

path="$(echo "$1" | sed -e 's%/*$%%')"  # remove trailing slashes
maxDepth="$2"                           # maximum search depth
currDepth="$3"                          # current depth

[ -z "$currDepth" ] && currDepth=0      # initialize on the first invocation

[ "$currDepth" -lt "$maxDepth" ] && {   # are we allowed to go deeper?
    echo "D: \"$path\""                 # show where we are
    IFS=$'\n'                           # split the "ls" output on newlines instead of spaces
    for entry in $(ls "$path"); do      # scan directory (no -F: its suffix markers would corrupt the names)
        [ -d "$path/$entry" ] && {      # recursively descend if it is a child directory
            "$0" "$path/$entry" "$maxDepth" $((currDepth+1))
            continue
        }
        echo "F: \"$path/$entry\""      # show it if it is not a directory (symlink, file, whatever)
    done
}
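Saved as, say, finddepth.sh (the name is only an example) and made executable, it would be invoked with a start path and a depth limit:

./finddepth.sh /some/path 2

Note that the script relies on bash for the IFS=$'\n' assignment, so the shebang may need adjusting on systems that ship only a plain POSIX sh.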
evolvah