
I'm running some commands on multiple servers and I want them all to run concurrently:

$connections = [];

foreach ($clust['hosts'] as $hostStr) {
    // custom helper: split "host:port", falling back to port 22
    list($host, $port) = Str::rsplit($hostStr, ':', 2, 22);

    $ssh = new \Net_SSH2($host, $port);

    if (!$ssh->login($username, $key)) {
        throw new \Exception("Could not connect to $username@$host:$port");
    }

    $connections[] = $ssh;
}

foreach ($connections as $i => $ssh) {
    // exec() blocks until the remote command finishes, so this runs one host at a time
    $ssh->exec('cd /path/to/my/project && hg up', function ($str) {
        echo $str;
    });
    echo "right after exec $i\n";
}

But this always runs sequentially. Can I tell Net_SSH2 to be non-blocking?

mpen

1 Answer


One thing you could do is call $ssh->setTimeout(1) (or $ssh->setTimeout(0.5), etc.) so that exec() stops waiting for output after that many seconds instead of blocking until the command finishes.
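Something like this (a rough sketch, reusing the $connections array from the question and assuming the usual phpseclib behaviour that exec() returns whatever output it has collected once the timeout expires):

foreach ($connections as $i => $ssh) {
    $ssh->setTimeout(1); // stop waiting for output after ~1 second
    $partial = $ssh->exec('cd /path/to/my/project && hg up');
    echo "kicked off hg up on host $i, output so far:\n$partial";
}

The command itself should keep running on the remote side; you just stop waiting for its output.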

You could also just do $ssh->write(...) and never call $ssh->read() — write() returns as soon as the command has been sent to the remote shell.
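Along these lines (again just a sketch, assuming phpseclib's interactive-shell write()/read() methods; note the trailing newline so the remote shell actually executes the line):

foreach ($connections as $i => $ssh) {
    // queue the command on the remote shell and return immediately;
    // since read() is never called, nothing blocks waiting for output
    $ssh->write("cd /path/to/my/project && hg up\n");
    echo "sent command to host $i\n";
}

The trade-off is that you never see the output (or exit status) of hg up unless you come back and read() later.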

neubert
  • Using `setTimeout` won't abort the command, it will just stop waiting for a response? – mpen Aug 01 '14 at 15:49
  • 1
    Yup - I believe that's correct! You should be able to verify by doing ps aux | grep command in a separate ssh window. – neubert Aug 01 '14 at 16:16
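If you want to check from PHP whether exec() returned because the timeout expired rather than because the command finished, newer phpseclib versions also expose an isTimeout() method (worth checking your version). A minimal sketch combining it with the setTimeout(1) idea above:

$ssh->setTimeout(1);
$output = $ssh->exec('cd /path/to/my/project && hg up');
if ($ssh->isTimeout()) {
    // exec() gave up waiting; hg up is presumably still running remotely
    echo "hg up still running, output so far:\n$output";
} else {
    // the command completed within the timeout
    echo "hg up finished:\n$output";
}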