
I want to take a .csv file from Google Cloud Storage and send it to an SFTP server. I do not know what I am doing wrong, but I am getting an error that the file is not found in Cloud Storage. My code is below:

import base64
import os
import pysftp
import re
import csv
from google.cloud import storage
from google.cloud import bigquery

def hello_sftp(event, context):
    
    #defining credentials for the transfer
    myHostName = 'lmno.abcd.com'
    myUsername = 'pqr'
    myPassword = 'xyz'
    filename = 'test_file.csv'
    path = "gs://testing/"

    copy_file_on_ftp(myHostName, myUsername, myPassword, filename, path)
   
def copy_file_on_ftp(myHostName, myUsername, myPassword, filename, localpath):
    
    remotepath = '/Export/' + str(filename)
    print(' ')
    print(localpath)
    print(' ')
    cnopts = pysftp.CnOpts()
    cnopts.hostkeys = None
    
    with pysftp.Connection(
    host=myHostName, username=myUsername, password=myPassword, cnopts=cnopts
    ) as sftp:
        print("Connection successfully established . . . ")
        print("Exporting . . . ")
        print("local path and file name is : ", localpath+filename)
        sftp.put(localpath=localpath+filename, remotepath=remotepath)
    sftp.close()
    print("export to sFTP successful!")

but I get the error

FileNotFoundError: [Errno 2] No such file or directory: 'gs://testing/test_file.csv'

Is it not possible to send the data there?

sdave

1 Answer


pysftp does not understand gs:// URLs. It expects a path on the local filesystem, which is why sftp.put fails with FileNotFoundError.

Instead, use the Google Cloud Storage API to download the blob directly to a file-like object representing the target file on the SFTP server, opened with pysftp:

# Get a handle to the source object in the bucket
storage_client = storage.Client()
bucket = storage_client.bucket(bucket_name)
blob = bucket.blob(source_blob_name)

# Open the target file on the SFTP server and stream the blob's bytes into it
with sftp.open(remotepath, 'w', 32768) as f:
    blob.download_to_file(f)
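For completeness, here is a minimal sketch of how the question's Cloud Function could be restructured around this approach. The host, credentials, bucket name (`testing`) and file name are the placeholder values from the question, not working values; the imports are deferred into the function so the module can be loaded without the third-party packages installed:

```python
def remote_path_for(filename):
    """Build the target path on the SFTP server, as in the question."""
    return '/Export/' + filename

def transfer_blob(host, username, password, bucket_name, blob_name):
    """Stream a GCS object straight to an SFTP server, no local copy needed."""
    # Deferred imports: pysftp and google-cloud-storage are third-party packages.
    import pysftp
    from google.cloud import storage

    storage_client = storage.Client()
    blob = storage_client.bucket(bucket_name).blob(blob_name)

    cnopts = pysftp.CnOpts()
    cnopts.hostkeys = None  # disables host-key checking, as in the question; avoid in production

    with pysftp.Connection(host=host, username=username,
                           password=password, cnopts=cnopts) as sftp:
        # Open the remote file and stream the blob's bytes into it.
        with sftp.open(remote_path_for(blob_name), 'w', 32768) as f:
            blob.download_to_file(f)

def hello_sftp(event, context):
    # Placeholder values from the question.
    transfer_blob('lmno.abcd.com', 'pqr', 'xyz', 'testing', 'test_file.csv')
```

Note that the `with` block closes the connection automatically, so the explicit `sftp.close()` from the question is not needed.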
Martin Prikryl