
I'm writing a script in Python and I'm trying to wrap my head around a problem. I have a URL that, when opened, downloads a document. I'm trying to write a Python script that opens the HTTPS URL that downloads this document and automatically sends that document to a server I have opened using Python's pysftp module.

I can't wrap my head around how to do this... Do you think I'd be able to just do:

server.put(urllib.open('https://......./document'))

EDIT: This is the code I've tried, since the approach above doesn't work...

import csv
import urllib2

# Download the file and strip quotes; splitting on ',' flattens the contents into one row of values
download_file = urllib2.urlopen('https://somewebsite.com/file.csv')
file_contents = download_file.read().replace('"', '')
columns = [x.strip() for x in file_contents.split(',')]

# Write downloaded file contents to a new local CSV file
with open('file.csv', 'wb') as f:
    writer = csv.writer(f)
    writer.writerow(columns)

# Upload the new file to the server (srv is an already-open pysftp.Connection)
srv.put('./file.csv', './SERVERFOLDER/file.csv')
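
For comparison, a minimal sketch of a more direct approach, assuming a hypothetical host and remote path: pysftp's putfo() accepts any file-like object, and urllib2.urlopen() returns one, so the intermediate local file can be skipped entirely:

import urllib2
import pysftp

# Open the connection (hostname and credentials here are placeholders)
with pysftp.Connection('sftp.example.com', username='user', password='pass') as srv:
    # urllib2.urlopen() returns a file-like object with a read() method,
    # and putfo() uploads from any file-like object, so no local copy is needed
    response = urllib2.urlopen('https://somewebsite.com/file.csv')
    srv.putfo(response, './SERVERFOLDER/file.csv')

This also suggests why plain srv.put(urllib.open(...)) fails: put() expects a local file path, not a file object.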

ALSO: How would I go about getting a file that is ONE DAY old from the server (examining the age of each file), using paramiko?
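
For the file-age part, a minimal sketch assuming a hypothetical host and remote folder: paramiko's listdir_attr() returns SFTPAttributes entries whose st_mtime field is the last-modified time as a Unix timestamp, so each file's age can be compared against a cutoff:

import time
import paramiko

# Connect and open an SFTP session (hostname and credentials are placeholders)
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('sftp.example.com', username='user', password='pass')
sftp = ssh.open_sftp()

one_day_ago = time.time() - 24 * 60 * 60

# listdir_attr() returns stat-like entries, avoiding a separate stat() call per file
for entry in sftp.listdir_attr('./SERVERFOLDER'):
    if entry.st_mtime <= one_day_ago:
        print entry.filename  # modified at least one day ago

sftp.close()
ssh.close()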

  • Try and see & then tell us what happened. :) – ρss May 16 '14 at 20:43
  • Didn't work. I had to read the result of urllib2.urlopen() into a variable, then call read() on it... from there, I parsed it and placed it into a local CSV file, which I uploaded to the server... there surely must be a faster, more efficient way. – Jake Z May 16 '14 at 20:53
  • It would be convenient if you could provide your code in your question. – ρss May 16 '14 at 20:54
  • Just updated my question with the code. :) – Jake Z May 16 '14 at 20:57
