
First, I'm sorry to say that I do not have access to GitHub or SourceForge.net, so I can't retrieve a solution like this one: possible answer

I have half a gigabyte (510199112 bytes) to download by FTP, but I get a time-out error:

ftplib.error_temp: 450 Socket write to client timed-out.

I would like to manage my time-out, but I do not know how to proceed in my current code:

    ftp_host = 'someFTP'
    login_user = 'johnDoe'
    pwd = 'secret'

    print "connection to ftp %s with user %s" %(ftp_host,login_user)
    ftp = ftplib.FTP(ftp_host)
    ftp.set_debuglevel(1)
    ftp.login(login_user,pwd)

    filename = 'bigFile.txt'
    destination = open('C:/mydestination', 'wb')  # binary write mode is required

    try:
        print "retrieving file %s to %s" % (filename, destination)
        ftp.retrbinary('RETR %s' % filename, destination.write)
    except Exception as e:
        destination.close()
        print "closing connection"
        ftp.quit()
        raise
    print "closing connection"
    ftp.quit()

    destination.close()
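A common workaround for this kind of server-side timeout is to restart the transfer from the last byte received, using the `rest` argument of `retrbinary` together with a client-side `timeout`. The sketch below is not from the question; the function name, retry count, and timeout value are my own illustrative choices, and the host/credential placeholders stand in for yours:

```python
import ftplib
import os

def download_with_resume(ftp_host, user, pwd, remote_name, local_path,
                         max_retries=5):
    """Restart an FTP download from the last received byte on 4xx errors."""
    for attempt in range(max_retries):
        # Resume from however many bytes we already have on disk.
        offset = os.path.getsize(local_path) if os.path.exists(local_path) else 0
        ftp = ftplib.FTP(ftp_host, timeout=60)  # fail fast instead of hanging
        ftp.login(user, pwd)
        try:
            # 'ab' appends instead of truncating the partial file.
            with open(local_path, 'ab') as destination:
                ftp.retrbinary('RETR %s' % remote_name,
                               destination.write, rest=offset)
            return  # transfer completed
        except ftplib.error_temp:
            continue  # e.g. "450 ... timed-out": reconnect and resume
        finally:
            ftp.close()
    raise IOError('giving up after %d attempts' % max_retries)
```

Note that `rest` only works if the server supports the `REST` command; most do, but it is worth checking against yours.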
  • possible duplicate of [Download big files via FTP with python](http://stackoverflow.com/questions/8323607/download-big-files-via-ftp-with-python) – Stiffo Aug 12 '15 at 08:17
  • Have you tried the socket options in the question that you cited? These same socket options are reused in the github answer that you can not currently access. Since the code (MIT licensed) in the github repository cited in the answer is only a few hundred lines long, it should be straightforward to find a way to get it especially since you seem to have access to stackoverflow... – Paul Aug 12 '15 at 08:31
  • What do you mean by socket options? Sorry, I am really not too familiar with the concept behind it and would be pleased to get further explanations! – Colonel Beauvel Aug 12 '15 at 08:36
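The "socket options" mentioned in the comments refer to enabling TCP keepalive on the FTP connection, so that idle control or data sockets are not silently dropped. A minimal sketch, assuming a Linux host (the `TCP_KEEPIDLE`/`TCP_KEEPINTVL` constants are platform-specific, hence the `hasattr` guards; the helper name and interval values are mine):

```python
import socket

def enable_keepalive(sock, idle=60, interval=30):
    """Turn on TCP keepalive probes for an already-connected socket."""
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    if hasattr(socket, 'TCP_KEEPIDLE'):   # seconds of idleness before probing
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, idle)
    if hasattr(socket, 'TCP_KEEPINTVL'):  # seconds between probes
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, interval)
```

After `ftp.login(...)`, this could be applied to the control connection with `enable_keepalive(ftp.sock)`.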

1 Answer


You can try using the wget library for Python:

    import wget

    link = 'ftp://example-destination.com/foo.txt'
    wget.download(link)