
I can't figure out how to pick between ftplib and urllib2 for downloading a file over FTP. As far as I can tell, they work comparably well, and most recommendations seem to presume one or the other without listing pros and cons. There are also other libraries to consider.

How should I pick in general? It's not just a matter of finding something that works; I've already done that two different ways. I'm wondering what the Right choice is. My basic thought is to use ftplib, since it is, after all, an FTP server...

In my specific case, I'm dealing with very large files (several GB) on public servers. I'd like to show a progress bar of some sort, and since the files are so large, a resume capability would also be nice -- but those are optional, just niceties for the people who will use my code. I don't want to use a library that is already marked for deprecation.
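
To make the requirements concrete, here's an untested sketch of what I have in mind with ftplib (host, paths, and filenames are placeholders; I haven't verified this against a real server):

```python
import ftplib
import os
import sys

HOST = "ftp.example.com"          # placeholder public server
REMOTE_PATH = "/pub/bigfile.iso"  # placeholder remote file
LOCAL_PATH = "bigfile.iso"

ftp = ftplib.FTP(HOST)
ftp.login()                 # anonymous login on a public server
ftp.voidcmd("TYPE I")       # binary mode, so SIZE reports byte counts

try:
    total = ftp.size(REMOTE_PATH)   # not every server supports SIZE
except ftplib.error_perm:
    total = None

# Resume from however many bytes are already on disk.
offset = os.path.getsize(LOCAL_PATH) if os.path.exists(LOCAL_PATH) else 0

f = open(LOCAL_PATH, "ab")

def on_chunk(chunk):
    f.write(chunk)
    if total:
        sys.stdout.write("\r%5.1f%% of %d bytes" % (100.0 * f.tell() / total, total))
        sys.stdout.flush()

# rest=offset sends a REST command so the server starts mid-file,
# which is what gives us the resume behavior.
ftp.retrbinary("RETR " + REMOTE_PATH, on_chunk, rest=offset)
f.close()
ftp.quit()
```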

amos
  • I'd also consider [`pyftpdlib`](http://code.google.com/p/pyftpdlib/), which is used by projects such as Bazaar and Chromium. – Bakuriu Mar 26 '13 at 17:33
  • Ok, this is not directly relevant, but would you consider an out-of-Python solution such as the well-known rsync? For files of that size, the transfer is better done in the background, i.e. asynchronously, anyway. – RayLuo Mar 26 '13 at 19:10
  • My first solution was to run curl through subprocess, but it was failing for some reason, so I figured a more Pythonic approach would give better error reporting, timeouts, etc. My script needs to wait until the download completes, because afterwards I run tests on the file. – amos Mar 26 '13 at 19:54
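
For reference, the subprocess-plus-curl approach described in the last comment might look roughly like this (URL and output name are placeholders):

```python
import subprocess

# Hypothetical sketch of the curl-via-subprocess approach. "-C -" asks
# curl to resume from wherever the partial file left off; "-#" draws a
# progress bar; "--fail" makes server errors surface as a nonzero exit.
url = "ftp://ftp.example.com/pub/bigfile.iso"
subprocess.check_call(
    ["curl", "--fail", "-#", "-C", "-", "-o", "bigfile.iso", url]
)
# check_call blocks until the transfer finishes and raises
# CalledProcessError on failure, so tests can run right after it.
```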
