
I'm new to python and I am trying to accomplish a multithreaded ssh session with paramiko.

def worker(host, cmd):
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, username='root', password='password')
    stdin, stdout, stderr = ssh.exec_command(cmd)
    stdin.flush()
    pid = stdout.readline()  # <-- this line


def main():
    threads = []
    f = open(options.host_file)
    ips = f.read().splitlines()
    for line in ips:
        user, host = line.split("@")
        my_thread = threading.Thread(target=worker, args=(host, cmd))
        threads.append(my_thread)
        my_thread.setDaemon(True)
        my_thread.start()
        my_thread.join()

        is_running = my_thread.is_alive()
        print "Value of IS RUNNING is: ", is_running, "\r"

What I have observed: if I keep pid = stdout.readline() in the worker, each thread runs its ssh command and waits for that job to finish before the next one starts. If I exclude that line, all of the threads run at the same time (which is what I would like), but the ssh sessions exit immediately.
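For what it's worth, here is a stdlib-only sketch of the parallel pattern I'm after, with a sleep standing in for the ssh job. Note that in my real code join() is called inside the start loop, which serializes the workers; this sketch starts everything first and only then waits:

```python
import random
import threading
import time

def worker(results, host):
    # Stand-in for the ssh job: sleep a random "run length",
    # then record which host finished.
    time.sleep(random.uniform(0.01, 0.05))
    results.append(host)

def run_all(hosts):
    # Start every thread first, THEN wait: calling join() inside
    # the start loop serializes the jobs one after another.
    results = []
    threads = [threading.Thread(target=worker, args=(results, h))
               for h in hosts]
    for t in threads:
        t.start()
    # Monitor loop: is_alive() is meaningful here because the
    # threads have not been joined yet; restart logic would go here.
    while any(t.is_alive() for t in threads):
        time.sleep(0.01)
    return sorted(results)
```

With this shape, the monitor loop is also where a finished thread could be replaced by a new one with fresh parameters.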

I would like to set up a threaded ssh session to each of several hosts and monitor the threads until the program I start over ssh is finished. Since the jobs will run for different lengths of time, I'd like to restart the ones that end with new parameters.

Also isAlive always returns False for me.

Any assistance is greatly appreciated. Is this possible, or do I have to monitor the PID of the process started on the remote hosts?

user3157494
  • Welcome to StackOverflow and to Python! The program above has syntax errors, is this a reduced version of the program you're working with? – Kyle Kelley Jan 03 '14 at 14:30

1 Answer


Have you considered Fabric?

It lets you run commands in parallel across machines, over SSH using your keys as necessary.

Here's an example fabfile.py:

from fabric.api import env, run, parallel

env.user = 'root'
env.hosts = ['host1', 'host2']

@parallel
def apt_update():
    '''
    Runs apt-get update and upgrade
    '''
    run('apt-get -y update')
    run('apt-get -y upgrade')

To run, you simply use fab apt_update.

In your use case, it looks like you need user and host defined dynamically from the options file. env.user and env.hosts can be set dynamically as needed (though I haven't tried setting env.user per host while running parallel tasks).

Kyle Kelley
  • If I use fabric and the cmds finish at different times is there a way to catch the ended process using fabric? and restart the proc? – user3157494 Jan 03 '14 at 15:42
  • Do you need these processes to just keep continuing on running? – Kyle Kelley Jan 03 '14 at 15:44
  • Sort of. They will need to be restarted with random parameter run lengths, meaning that they will end at different times and, hopefully, be restarted with a new randomized run length. – user3157494 Jan 03 '14 at 15:52
  • It might be worth it to put these under [supervision by supervisord](http://supervisord.org/), so you don't have to keep a connection open to keep your tasks running. You'll have to set something up to send in (or generate) your randomized run lengths. – Kyle Kelley Jan 03 '14 at 17:43