
My Bash-fu is not strong. Right now I have something like this:

function update_project {
  for i in server-{1,2,3,4} ; do
    echo "Updating $i"
    ssh "$i" "git pull"
  done
}

The number of servers is growing every day, and since each update takes about 20 seconds, I'd like to run them concurrently. What's the best way to do this while still being able to see the output (e.g. failed merges)?

linkedlinked

5 Answers


func will let you send a command to an arbitrary number of machines and watch the output.
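
For example, once your hosts are registered with the func overlord, something like this should work (the host glob and repository path are placeholders, and the module syntax can vary between func versions, so check the docs):

func "server-*" call command run "cd /path/to/project && git pull"

This runs the pull on every matching minion and prints each machine's output.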

Ignacio Vazquez-Abrams

Why are you trying to reinvent the wheel? Just use Capistrano to deploy your project code. It's designed for this exact purpose and runs the deploys in parallel across all of the configured machines.
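
As a rough sketch of what that looks like in Capistrano 2's Ruby DSL (the role name, server list, and /path/to/project are placeholders for your own setup):

role :app, "server-1", "server-2", "server-3", "server-4"

desc "Pull the latest code on every app server"
task :update_project, :roles => :app do
  run "cd /path/to/project && git pull"
end

cap update_project then runs the pull on all of the :app servers in parallel and shows you each server's output.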

rodjek

There are several parallel ssh projects. Look for pssh.
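
A minimal pssh invocation, assuming a hosts.txt with one server per line and a placeholder project path:

pssh -h hosts.txt -i "cd /path/to/project && git pull"

The -i flag prints each host's stdout and stderr as that host finishes, so failed merges are still visible.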

Joel K

One of the other answers is the right long-term approach, but to answer your question directly, just add an ampersand at the end of the command:

ssh "$i" "git pull" &
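
Note that backgrounded jobs interleave their output and the function returns before the pulls finish, so in practice you also want a wait, and it helps to tag each line with the host it came from. A sketch of the original function with both changes:

function update_project {
  for i in server-{1,2,3,4} ; do
    echo "Updating $i"
    # -n stops ssh from reading the loop's stdin; the sed prefix labels
    # each line of output with the host it came from
    ssh -n "$i" "git pull" 2>&1 | sed "s/^/$i: /" &
  done
  wait  # block until every background pull has finished
}
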
Dennis Williamson

If you have GNU Parallel (http://www.gnu.org/software/parallel/) installed, you can do this:

parallel --slf hostfile --nonall --tag git pull
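
Here hostfile is a plain text file with one ssh login per line, for example (the user name is just a placeholder):

server-1
server-2
deploy@server-3

--nonall runs the command once on every host in the file, and --tag prefixes each line of output with the host it came from. If the repository is not in the login directory, use "cd /path/to/project && git pull" as the command instead.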

You can install GNU Parallel simply by:

wget http://git.savannah.gnu.org/cgit/parallel.git/plain/src/parallel
chmod 755 parallel
cp parallel sem

(sem is just a copy of parallel under a different name; move both files somewhere on your $PATH.)

Watch the intro videos for GNU Parallel to learn more: https://www.youtube.com/playlist?list=PL284C9FF2488BC6D1

Ole Tange