
I'm working on a script that connects several "client" computers to a "server" computer, which then uses those clients to process several files, using FTP (pyftplib and pyftpdlib) for transferring files and results.

The script works by creating 3 folders on the server: Files, Processing and Results. The clients connect to the server by FTP, access the "Files" folder, get a file for processing, and move it to the "Processing" folder while they process it. When processing finishes, the client deletes the file from the "Processing" folder and copies the results to the "Results" folder.

This is working correctly, both on the server and the client side. The problem I'm having is that if one of the clients disconnects midway without generating an error (PC is disconnected, power outage), the server treats this as if the client were still processing the file, and the file stays in the "Processing" folder. What I want is an error-checking function so that, when this happens, the file in the "Processing" folder is returned to the "Files" folder.

Here is the server FTP code:

import os

from pyftpdlib.authorizers import DummyAuthorizer
from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer

port = 2121  # example value; the actual script takes the port from its arguments

def main():
    authorizer = DummyAuthorizer()
    authorizer.add_user('client', 'password', '.', perm='elradfmwM')
    authorizer.add_anonymous(os.getcwd())

    handler = FTPHandler
    handler.authorizer = authorizer
    handler.banner = "FTP Server."
    address = ('', port)
    server = FTPServer(address, handler)
    server.max_cons = 256
    server.max_cons_per_ip = 50
    server.serve_forever()


if __name__ == '__main__':
    main()

And here is the Client FTP code:

import ftplib
import json
import os
import subprocess

# `arguments` (host_ip, host_port) comes from the script's argparse setup.

while True:
    ftp = ftplib.FTP()
    ftp.connect(arguments.host_ip, arguments.host_port)
    ftp.login("client", "password")
    print ftp.getwelcome()
    ftp.retrlines('LIST')
    ftp.retrbinary('RETR Output.txt', open('Output.txt', 'wb').write)
    ftp.retrbinary('RETR dicionario.json', open('dicionario.json', 'wb').write)
    with open('dicionario.json') as json_file:
        json_data = json.load(json_file)
    receptor_file = json_data['--receptor']
    print 'Retrieving receptor file ' + receptor_file
    ftp.retrbinary('RETR ' + receptor_file, open(receptor_file, 'wb').write)
    ftp.cwd('Files')
    ftp.retrlines('LIST')
    filename = ftp.nlst()[0]
    print 'Getting ' + filename
    ftp.retrbinary('RETR ' + filename, open(filename, 'wb').write)
    with open("Output.txt", "a") as input_file:
        input_file.write('ligand = %s' % filename)
    ftp.delete(filename)
    ftp.cwd('../Processing')
    ftp.storbinary('STOR ' + filename, open(filename, 'rb'))
    ftp.quit()

    print "Processing"
    return_code = subprocess.call(...)  # placeholder: invokes the external processing program
    if return_code == 0:
        print """Done!"""
        ftp.connect(arguments.host_ip, arguments.host_port)
        ftp.login("client", "password")
        ftp.cwd('Results')
        ftp.storbinary('STOR ' + os.path.splitext(filename)[0] + '_out.pdbqt', open(os.path.splitext(filename)[0] + '_out.pdbqt', 'rb'))
        ftp.cwd('../Processing')
        ftp.delete(filename)


        ftp.quit()
    else:
        print """Something is technically wrong..."""
        ftp.connect(arguments.host_ip, arguments.host_port)
        ftp.login("client", "password")
        ftp.cwd('Files')
        ftp.storbinary('STOR ' + filename, open(filename, 'rb'))
        ftp.cwd('../Processing')
        ftp.delete(filename)
        ftp.quit()

Thanks for the help!

Deufo

1 Answer


So, after half a month of fiddling with this code, I finally made it work when a client cancels the connection.

First, I had to give the server a way to identify each client. Instead of making them log in with only one user, I created specific users for each connection, using two different functions:

import random
import string

def handler_generation(size=9, chars=string.ascii_uppercase + string.digits):
    return ''.join(random.choice(chars) for i in range(size))

This generates a random 9-character string, used for both the login and the password.
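For instance, a quick check of what it produces (Python 3 syntax here, unlike the Python 2 snippets above):

```python
import random
import string

# Same generator as above, reproduced so this example is self-contained.
def handler_generation(size=9, chars=string.ascii_uppercase + string.digits):
    return ''.join(random.choice(chars) for _ in range(size))

login = handler_generation()
password = handler_generation()
print(len(login), len(password))  # both are 9 characters long
```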

Then I created a custom handler in pyftpdlib and used its on_login callback:

class MyHandler(FTPHandler):
    def on_login(self, username):
        if username == "client":
            user_login = handler_generation()
            user_password = handler_generation()
            global authorizer
            authorizer.add_user(user_login, user_password, '.', perm='elradfmwM')
            with open("Credentials.txt", 'w') as credentials:
                credentials.write(user_login)
                credentials.write("\n")
                credentials.write(user_password)
        else:
            pass

So, when a client connects with the generic "client" login, the server generates a 9-character login and password and sends them to the client in the "Credentials.txt" file. On the client side, it does this:

ftp.login("client", "password")
ftp.retrbinary('RETR Credentials.txt', open('Credentials.txt', 'wb').write)
ftp.quit()
with open('Credentials.txt') as credential_file:
    lines = credential_file.readlines()
    credential_login = lines[0].split("\n")[0]
    credential_password = lines[1].split("\n")[0]
ftp.connect(arguments.host_ip, arguments.host_port)
ftp.login(credential_login, credential_password)

So now the clients all connect with their own specific login. On the client side, I made it so that for each completed task the client sends a file named after its specific login. I also made the client prepend its login to the name of the file it is processing, to make it easy for the server to find the file:

ftp.rename(filename, credential_login + filename)
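The "task done" marker upload itself isn't shown above; a minimal sketch of it, assuming the connected `ftplib` session and the `credential_login` from the earlier snippets (Python 3 syntax, with a fake FTP object standing in for a live connection so the example runs standalone):

```python
import io

def send_completion_marker(ftp, credential_login):
    # Upload an empty file named after the session's one-off login, so the
    # server's on_disconnect sees it and treats the disconnect as a clean exit.
    ftp.storbinary('STOR ' + credential_login, io.BytesIO(b''))

# Fake connection to illustrate the call without a live server.
class FakeFTP:
    def __init__(self):
        self.commands = []
    def storbinary(self, cmd, fp):
        self.commands.append(cmd)

conn = FakeFTP()
send_completion_marker(conn, 'ABC123XYZ')
print(conn.commands)  # ['STOR ABC123XYZ']
```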

Then I used another callback of the handler class, on_disconnect:

def on_disconnect(self):
    if self.username == "client":
        pass
    else:
        if os.path.isfile(self.username):
            pass
        else:
            for fname in os.listdir("Processing"):
                if fname.startswith(self.username):
                    shutil.move("Processing/" + fname, "Files")
                    # strip the 9-character login prefix from the file name
                    os.rename("Files/" + fname, "Files/" + fname[9:])

    print self.remote_ip, self.remote_port, self.username, "disconnected"

Now, whenever a client disconnects, the server checks whether that client sent its marker file. If it's not there, the server moves the client's file back to the "Files" folder, which holds the files that are yet to be processed.

To make a failed client disconnect from the server without sending a QUIT command, I used the timeout feature of pyftpdlib. To make sure an active client would not accidentally time out, I implemented a thread in the client that does something with the server every N seconds:

from threading import Timer

class perpetualTimer():

    def __init__(self, t, hFunction):
        self.t = t
        self.hFunction = hFunction
        self.thread = Timer(self.t, self.handle_function)

    def handle_function(self):
        self.hFunction()
        self.thread = Timer(self.t, self.handle_function)
        self.thread.start()

    def start(self):
        self.thread.start()

    def cancel(self):
        self.thread.cancel()

def NotIdle():
    # do something lightweight against the server here
    pass

t = perpetualTimer(10, NotIdle)
t.start()

(I copied this particular code straight from someone else's answer here.)
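The NotIdle body can be anything cheap that touches the server; one option, assuming the client keeps its `ftplib.FTP` object around, is an FTP NOOP. A sketch (Python 3 syntax, with a fake connection so it runs standalone):

```python
def make_not_idle(ftp):
    # Returns a NotIdle callback bound to a connected ftplib.FTP instance.
    def NotIdle():
        try:
            ftp.voidcmd('NOOP')  # no-op command: resets the server's idle timer
        except Exception:
            pass  # connection already dead; the server's timeout will clean up
    return NotIdle

# Fake connection to demonstrate the callback without a live server.
class FakeFTP:
    def __init__(self):
        self.sent = []
    def voidcmd(self, cmd):
        self.sent.append(cmd)

conn = FakeFTP()
ping = make_not_idle(conn)
ping()
ping()
print(conn.sent)  # ['NOOP', 'NOOP']
```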

And voilà. Now both the server and the client work and have their own error checking.
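For reference, the server-side timeout mentioned above is just an attribute on the pyftpdlib handler. A minimal sketch, where the 60-second value is my own choice and should be comfortably larger than the client's ping interval:

```python
# In the server's main(), before constructing FTPServer:
handler = MyHandler
handler.timeout = 60  # drop clients after 60 seconds of inactivity
```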

I'm putting this answer here in case someone encounters a similar problem.

Thanks!

Deufo