I have a csv file that contains file paths such as oa_package/08/e0/PMC13900.tar.gz, oa_package/b0/ac/PMC13901.tar.gz, and so on.
If one were to type ftp.ncbi.nlm.nih.gov/pub/pmc/oa_package/08/e0/PMC13900.tar.gz into a browser, it would automatically download the file. Is there a way to replicate this behavior in Python?
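In other words, I imagine something along the lines of the sketch below, though I'm not sure this is the right approach (the "downloads" directory is just a placeholder for wherever the file should end up):

import os
import urllib.request

# Build the ftp:// URL for one of the paths from the csv and fetch it directly.
url = "ftp://ftp.ncbi.nlm.nih.gov/pub/pmc/oa_package/08/e0/PMC13900.tar.gz"
os.makedirs("downloads", exist_ok=True)                        # placeholder output folder
local_path = os.path.join("downloads", os.path.basename(url))  # -> downloads/PMC13900.tar.gz
urllib.request.urlretrieve(url, local_path)                    # saves the archive locally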
Previously, I successfully used the following try/except together with host.listdir('.') to iterate over and download files from the current directory. Now, however, I need to download files that live outside the current working directory.
import ftputil

host = ftputil.FTPHost('ftp.ncbi.nlm.nih.gov', 'anonymous', email)
path = "pub/pmc/"
host.chdir(path)  # now at ftp.ncbi.nlm.nih.gov/pub/pmc/
for file in host.listdir('.'):
    outFile = outDir + file
    print("retrieving file:", file)
    try:
        host.download(file, outFile)  # file is a .tar.gz archive
    except ftputil.error.FTPOSError:
        print("WARNING: File could not be downloaded")
The traceback I get is:
FTPOSError: [Errno 11001] getaddrinfo failed
Debugging info: ftputil 3.4, Python 3.6.5 (win32)
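For reference, what I'm ultimately aiming for is roughly the sketch below, assuming one path per row in the csv (paths.csv and the csv layout are just placeholders; email and outDir are the same variables as in my code above):

import csv
import os
import ftputil

with ftputil.FTPHost('ftp.ncbi.nlm.nih.gov', 'anonymous', email) as host:
    with open("paths.csv", newline="") as f:      # assumed layout: one path per row
        for row in csv.reader(f):
            ftp_path = "pub/pmc/" + row[0]        # e.g. pub/pmc/oa_package/08/e0/PMC13900.tar.gz
            outFile = os.path.join(outDir, os.path.basename(ftp_path))
            try:
                host.download(ftp_path, outFile)  # path given relative to the FTP root
            except ftputil.error.FTPOSError:
                print("WARNING: File could not be downloaded:", ftp_path)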