225

How do I write a bash script that goes through each directory inside a parent_directory and executes a command in each directory?

The directory structure is as follows:

parent_directory (name could be anything - doesn't follow a pattern)

  • 001 (directory names follow this pattern)
    • 0001.txt (filenames follow this pattern)
    • 0002.txt
    • 0003.txt
  • 002
    • 0001.txt
    • 0002.txt
    • 0003.txt
    • 0004.txt
  • 003
    • 0001.txt

the number of directories is unknown.

Van de Graff

12 Answers

250

This answer posted by Todd helped me.

find . -maxdepth 1 -type d \( ! -name . \) -exec bash -c "cd '{}' && pwd" \;

The \( ! -name . \) avoids executing the command in the current directory.
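
For example, a hypothetical line count over the .txt files in each subdirectory (substitute whatever command you actually need for wc -l *.txt) would look like:

# Minimal sketch based on the answer above; wc -l *.txt is only a placeholder
find . -maxdepth 1 -type d \( ! -name . \) -exec bash -c "cd '{}' && wc -l *.txt" \;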

Christian Vielma
  • And if you only want to run the command in certain folders: `find FOLDER* -maxdepth 0 -type d \( ! -name . \) -exec bash -c "cd '{}' && pwd" \;` – Martin K Dec 03 '15 at 10:40
  • I had to do `-exec bash -c 'cd "$0" && pwd' {} \;` since some directories contained single quotes and some double quotes. – Charlie Gorichanaz Aug 18 '16 at 08:46
  • You could also omit the current directory by adding `-mindepth 1` – mdziob Jan 09 '17 at 16:05
  • How to change this to a function that accepts a command like `git remote get-url origin` instead of `pwd` directly? – roachsinai Feb 14 '19 at 03:51
  • `disd(){find . -maxdepth 1 -type d \( ! -name . \) -exec bash -c 'cd "{}" && "$@"' \;}` does not work. – roachsinai Feb 14 '19 at 03:55
  • For me the command didn't work as is. Taking a cue from the comments by CharlieGorichanaz and roachsinai, a subtle change worked: replace " by ' and vice versa. – bbv Mar 23 '21 at 08:34
  • If you're in a git repo and you don't want to break everything, remember to skip hidden folders, e.g. `find . -not -path '*/.*' -type f -exec sed -i 's/\r//' {} \;` – tuxErrante Feb 27 '23 at 13:59
134

You can do the following, when your current directory is parent_directory:

for d in [0-9][0-9][0-9]
do
    ( cd "$d" && your-command-here )
done

The ( and ) create a subshell, so the current directory isn't changed in the main script.
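
For example, assuming you wanted to list which files in each numbered directory contain a given string ('TODO' here is just a placeholder pattern), the same loop might look like:

for d in [0-9][0-9][0-9]
do
    # Subshell keeps the outer script in parent_directory
    ( cd "$d" && grep -l 'TODO' *.txt )
done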

Mark Longair
  • It doesn't matter here because the wildcard doesn't match anything else than numbers, but in the general case, you basically always want to put the directory name in double quotes inside the loop. `cd "$d"` would be better in that it transfers to situations where the wildcard does match files whose names contain whitespace and/or shell metacharacters. – tripleee Apr 17 '15 at 09:53
  • (All of the answers here seem to have the same flaw, but it matters the most in the top-voted answer.) – tripleee Apr 17 '15 at 09:54
  • Also, the vast majority of commands don't actually care in which directory you execute them. `for f in foo bar; do cat "$f"/*.txt; done >output` is functionally equivalent to `cat foo/*.txt bar/*.txt >output`. However, `ls` is one command that does produce slightly different output depending on how you pass it arguments; similarly, a `grep` or `wc` which outputs a relative file name will be different if you run it from a subdirectory (but often, you want to avoid going into a subdirectory precisely for that reason). – tripleee Apr 17 '15 at 09:56
  • Thanks for the answer, I used it like so: `for d in ./packages/*-spa/; do (cd "$d" && cp test.env .env); done` – Deepesh Nair Mar 30 '22 at 12:24
  • A very similar problem: if I have a bash script in the parent directory, can I run it using this formalism? I have a bash script that needs to run in each subdirectory. – Prasanta Bandyopadhyay Jun 01 '23 at 15:26
71

You can achieve this by piping into xargs. The catch is that you need the -I flag, which replaces the placeholder string ({} here) in your bash command with each item passed in by xargs.

ls -d */ | xargs -I {} bash -c "cd '{}' && pwd"

You may want to replace pwd with whatever command you want to execute in each directory.
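
For example, a hypothetical substitution that prints each directory and its disk usage (du -sh . stands in for whatever you actually want to run) might be:

# Sketch only: same pattern as above with a placeholder command
ls -d */ | xargs -I {} bash -c "cd '{}' && pwd && du -sh ."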

Piyush Singh
  • I don't know why this has not been upvoted, but it's the simplest script I found so far. Thanks – Linvi Feb 18 '19 at 10:13
  • This one is the best, as any command you run with xargs will generate an exit code, and follow-up actions can be taken from the value of '$?'. Thanks for this. – Biplob Biswas May 19 '22 at 14:56
63

If you're using GNU find, you can try the -execdir parameter, e.g.:

find . -type d -execdir realpath "{}" ';'

or (as per @gniourf_gniourf comment):

find . -type d -execdir sh -c 'printf "%s/%s\n" "$PWD" "$0"' {} \;

Note: You can use ${0#./} instead of $0 to strip the leading ./.

or more practical example:

find . -name .git -type d -execdir git pull -v ';'

If you want to include the current directory, it's even simpler by using -exec:

find . -type d -exec sh -c 'cd -P -- "{}" && pwd -P' \;

or using xargs:

find . -type d -print0 | xargs -0 -L1 sh -c 'cd "$0" && pwd && echo Do stuff'

Or similar example suggested by @gniourf_gniourf:

find . -type d -print0 | while IFS= read -r -d '' file; do
# ...
done

The above examples support directories with spaces in their name.


Or by assigning into bash array:

dirs=($(find . -type d))
for dir in "${dirs[@]}"; do
  ( cd "$dir" && echo "$PWD" )
done

Change . to your specific folder name. If you don't need to run recursively, you can use: dirs=(*) instead. The above example doesn't support directories with spaces in the name.

So as @gniourf_gniourf suggested, the only proper way to put the output of find in an array without using an explicit loop will be available in Bash 4.4 with:

mapfile -t -d '' dirs < <(find . -type d -print0)
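
Once the array is filled that way, you can loop over it with the usual quoting (a minimal sketch, assuming Bash 4.4+):

for dir in "${dirs[@]}"; do
  ( cd "$dir" && pwd )   # subshell, so no need to cd back
done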

Or, in a way that is not recommended (since it involves parsing ls):

ls -d */ | awk '{print $NF}' | xargs -n1 sh -c 'cd $0 && pwd && echo Do stuff'

The above example would ignore the current dir (as requested by the OP), but it'll break on names with spaces.

kenorb
  • The only proper way to put the output of `find` in an array without using an explicit loop will be available in Bash 4.4 with `mapfile -t -d '' dirs < <(find . -type d -print0)`. Until then, your only option is to use a loop (or defer the operation to `find`). – gniourf_gniourf Oct 04 '15 at 14:03
  • In fact, you don't really need the `dirs` array; you could loop on `find`'s output (safely) like so: `find . -type d -print0 | while IFS= read -r -d '' file; do ...; done`. – gniourf_gniourf Oct 04 '15 at 14:04
  • @gniourf_gniourf I'm visual and always trying to find/make things which look simple. E.g. I've [this script](https://github.com/EA31337/EA31337-Sets/blob/master/_scripts/run_backtests.sh) and using a bash array is easy and readable for me (by separating the logic into smaller pieces, instead of using pipes). Introducing additional pipes, `IFS`, `read`, gives the impression of additional complexity. I'll have to think about it, maybe there is some easier way of doing it. – kenorb Oct 04 '15 at 14:11
  • If by _easier_ you mean _broken_ then yes, there are easier ways of doing it. Now don't worry, all these `IFS` and `read` (as you say) become second nature as you get used to them… they are part of the canonical and idiomatic ways of writing scripts. – gniourf_gniourf Oct 04 '15 at 14:20
  • By the way, your first command `find . -type d -execdir echo $(pwd)/{} ';'` doesn't do what you want (the `$(pwd)` is expanded before `find` is even executed)… – gniourf_gniourf Oct 04 '15 at 14:21
  • @gniourf_gniourf When I've tried to run `pwd` before it's expanded: `find . -type d -execdir pwd ';'`, it doesn't work as expected either (it just shows the same dir), despite `find` returning different dirs. – kenorb Oct 04 '15 at 14:34
26

If the top-level folder is known, you can just write something like this:

for dir in `ls $YOUR_TOP_LEVEL_FOLDER`;
do
    for subdir in `ls $YOUR_TOP_LEVEL_FOLDER/$dir`;
    do
      $(PLAY AS MUCH AS YOU WANT);
    done
done

In place of $(PLAY AS MUCH AS YOU WANT); you can put as much code as you want.

Note that I didn't "cd" on any directory.
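
As the comments below point out, the `ls` calls can also be replaced by plain globs; a minimal sketch of the same two-level loop (still without any cd):

for dir in "$YOUR_TOP_LEVEL_FOLDER"/*/; do
    for entry in "$dir"*; do
        echo "$entry"   # PLAY AS MUCH AS YOU WANT
    done
done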

Cheers,

gforcada
  • There are a couple of instances of ["Useless Use of `ls`"](http://partmaps.org/era/unix/award.html#ls) here - you could just do `for dir in $YOUR_TOP_LEVEL_FOLDER/*` instead. – Mark Longair Sep 19 '11 at 11:40
  • Well, maybe it's useless in a general sense, but it allows you to do filtering directly in the ls (i.e. all directories ending with .tmp). That's why I used the `ls $dir` expression – gforcada Sep 19 '11 at 13:39
  • Will `for subdir in 'ls $YOUR_TOP_LEVEL_FOLDER/$dir';` traverse multiple directories nesting like `parent/child1/child1_1/child1_1_1/`? Or just one directory deep into the parent? – ruslaniv Feb 03 '21 at 06:21
23
for dir in PARENT/*
do
  test -d "$dir" || continue
  # Do something with $dir...
done
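
For example, filling in the body with a placeholder command (here just listing the .txt files inside each directory) might look like:

for dir in PARENT/*
do
  test -d "$dir" || continue
  # Hypothetical example: run the command in a subshell inside each directory
  ( cd "$dir" && ls *.txt )
done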
Idelic
7

While one-liners are good for quick and dirty usage, I prefer the more verbose version below for writing scripts. This is the template I use; it takes care of many edge cases and allows you to write more complex code to execute on a folder. You can write your bash code in the function dir_command. Below, dir_command implements tagging each repository in git as an example. The rest of the script calls dir_command for each folder in the directory. An example of iterating through only a given set of folders is also included.

#!/bin/bash

#Use set -x if you want to echo each command while getting executed
#set -x

#Save current directory so we can restore it later
cur=$PWD
#Save command line arguments so functions can access it
args=("$@")

#Put your code in this function
#To access command line arguments use syntax ${args[1]} etc
function dir_command {
    #This example tags the repository in the given folder and pushes the tag
    cd "$1"
    echo "$(tput setaf 2)$1$(tput sgr 0)"
    git tag -a "${args[0]}" -m "${args[1]}"
    git push --tags
    cd ..
}

#This loop will go to each immediate child and execute dir_command
find . -maxdepth 1 -type d \( ! -name . \) | while read -r dir; do
   dir_command "$dir/"
done

#This example loop only loops through a given set of folders
declare -a dirs=("dir1" "dir2" "dir3")
for dir in "${dirs[@]}"; do
    dir_command "$dir/"
done

#Restore the folder
cd "$cur"
Shital Shah
5

I don't get the point of the formatting of the files, since you only want to iterate through folders... Are you looking for something like this?

cd parent
find . -type d | while read -r d; do
   ls "$d"/
done
Aif
3

You could run a sequence of commands in each folder in one line, like:

for d in PARENT_FOLDER/*; do (cd "$d" && tar -cvzf "${d##*/}.tar.gz" *.*); done
user1859675
3

you can use

find .

to search all files/dirs in the current directory recursively.

Then you can pipe the output to the xargs command like so:

find . | xargs your-command
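
Note that xargs runs the command directly rather than through a shell, so if the command has to execute inside each directory you need to wrap it; a hedged sketch (pwd is just a placeholder):

# Run a placeholder command inside every directory found
find . -type d | xargs -I {} bash -c 'cd "$1" && pwd' _ {}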
Dan Bizdadea
2

In the code below you can use a pattern instead of * to filter out unwanted folders.

Modify -mindepth and -maxdepth 1 to a higher number in order to dive deeper. A value of 1 will go only one level deep. Remove these min and max depth arguments completely to go infinitely deep.

Note: Only use the code below when you know there are no spaces in the folder names.

#!/bin/bash
for folder in $(find . -mindepth 1 -maxdepth 1 -type d \( -name "*" \) );
do
  cd "$folder" || exit

  echo "Folder: $folder"

  cd -
done
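
If the folder names might contain spaces, a hedged alternative (along the lines of the find/read idiom shown in earlier answers) is to let find emit NUL-delimited names:

#!/bin/bash
find . -mindepth 1 -maxdepth 1 -type d -print0 | while IFS= read -r -d '' folder; do
  echo "Folder: $folder"
  ( cd "$folder" && pwd )   # subshell, so no cd - needed
done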
Hutch
0
for p in [0-9][0-9][0-9];do
    (
        cd "$p"
        for f in [0-9][0-9][0-9][0-9]*.txt;do
            ls "$f"; # Your operands
        done
    )
done
Fedir RYKHTIK