
I have a file >500 MB to download over an SFTP connection. I tried using pysftp and am getting SSHException: Server connection dropped:

import pysftp

myHostname = "dbfiles.xyz.org"
myUsername = "XXXX"
myPassword = "YYYY"
cnopts = pysftp.CnOpts()
cnopts.hostkeys = None  # disables host key checking

with pysftp.Connection(host=myHostname, username=myUsername,
                       password=myPassword, cnopts=cnopts) as sftp:
    print("Connection successfully established ...")
    localFilePath = 'c:/....'
    remote_files = sftp.listdir('/folder/')
    for filename in remote_files:
        if 'string_to_match' in filename:
            local_path = localFilePath + filename
            print(filename)
            print(local_path)
            sftp.get("folder/" + filename, local_path)

I get SSHException: Server connection dropped: followed by an EOF error after about 18 MB of the file has been downloaded. Is there any way to limit the amount of data requested at a time, or to slow down the get so the full file arrives? I have tried several approaches, but because of the large file size I have been unable to download the complete file. Any help is appreciated.
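A minimal sketch of one possible workaround: read the remote file in small chunks and, after each dropped connection, reconnect and resume from the size of the partial local file. The host and credentials are taken from the question; the file names, chunk size, and retry count are placeholder choices.

import os
import pysftp

def download_resumable(sftp, remote_path, local_path, chunk_size=32768):
    # Size of the remote file, so we know when the download is complete.
    remote_size = sftp.stat(remote_path).st_size
    # Resume from whatever a previous, interrupted attempt already wrote.
    offset = os.path.getsize(local_path) if os.path.exists(local_path) else 0
    with sftp.open(remote_path, 'rb') as remote_file, \
            open(local_path, 'ab' if offset else 'wb') as local_file:
        remote_file.seek(offset)
        while offset < remote_size:
            data = remote_file.read(chunk_size)
            if not data:  # server closed the stream early
                break
            local_file.write(data)
            offset += len(data)
    return offset == remote_size

cnopts = pysftp.CnOpts()
cnopts.hostkeys = None  # as in the question; verify host keys in production

for attempt in range(10):  # give up after 10 dropped connections
    try:
        with pysftp.Connection(host="dbfiles.xyz.org", username="XXXX",
                               password="YYYY", cnopts=cnopts) as sftp:
            if download_resumable(sftp, '/folder/big_file.dat',
                                  'c:/big_file.dat'):
                break
    except Exception:
        pass  # connection dropped; reconnect and resume

Each retry only re-downloads from the point where the previous attempt stopped, so a drop at 18 MB costs nothing but the reconnect.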

Hari_pb

1 Answer


Go to the Paramiko library's sftp_file.py and change MAX_REQUEST_SIZE to 1024. It worked for me. You can find the file here: /home//.local/lib/python3.8/site-packages/paramiko/sftp_file.py
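
If you would rather not edit the installed package, the same change can be made at runtime. In current Paramiko releases MAX_REQUEST_SIZE is a class attribute of SFTPFile, so a monkey-patch applied before opening the connection has the same effect:

import paramiko.sftp_file

# Default is 32768 bytes; some servers drop the connection on large
# or heavily pipelined read requests, so ask for less per round trip.
paramiko.sftp_file.SFTPFile.MAX_REQUEST_SIZE = 1024

Smaller requests mean more round trips per file, so the transfer will be slower, but each request is less likely to trip whatever limit is dropping the connection.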