I upload files via ssh2-sftp-client by passing a read stream to put(). That works fine on its own, but as soon as I attach an event listener to the stream, the files are not uploaded completely:
This works:
const stream = fs.createReadStream(myFile);
return this.sftpClient.put(stream, remoteFilePath);
This only uploads 80% of the file:
const stream = fs.createReadStream(myFile);
stream.on('data', (chunk) => {
  console.log('chunk passed');
});
return this.sftpClient.put(stream, remoteFilePath);
The last messages in debug mode show no error:
chunk passed
Outbound: Sending CHANNEL_DATA (r:0, 9380)
SFTP: Outbound: Sent WRITE (id:1554)
Inbound: CHANNEL_DATA (r:0, 28)
SFTP: Inbound: Received STATUS (id:1554, 0, "Success")
Outbound: Sending CHANNEL_DATA (r:0, 17)
SFTP: Outbound: Buffered CLOSE
Inbound: CHANNEL_DATA (r:0, 28)
SFTP: Inbound: Received STATUS (id:1555, 0, "Success")
CLIENT[sftp]: put: promise resolved
CLIENT[sftp]: put: Removing temp event listeners
The same problem occurs when piping the stream:
const stream = fs.createReadStream(myFile);
const fileMd5Hash = crypto.createHash('md5').setEncoding('hex');
stream.pipe(fileMd5Hash);
return this.sftpClient.put(stream, remoteFilePath);
It seems the problem is that both the 'data' listener and pipe() switch the stream into flowing mode, which might cause data loss since data is pushed to the application "as quickly as possible". So I wonder: how can I read the chunks to build an md5 hash of the whole file without switching the read stream into flowing mode?
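One workaround I can think of (a minimal sketch, assuming it is acceptable to read the file twice) is to hash from a second, independent read stream, so the stream handed to put() is never observed and stays untouched until put() consumes it internally. Only fs, crypto and the put() call from above are real APIs here; the variable names are my own:

const fs = require('fs');
const crypto = require('crypto');

// Hash from a separate stream; the upload stream gets no listeners,
// so nothing switches it into flowing mode before put() takes over.
const hashDone = new Promise((resolve, reject) => {
  const hashStream = fs.createReadStream(myFile);
  const md5 = crypto.createHash('md5');
  hashStream.on('data', (chunk) => md5.update(chunk));
  hashStream.on('end', () => resolve(md5.digest('hex')));
  hashStream.on('error', reject);
});

const uploadStream = fs.createReadStream(myFile);
return Promise.all([
  this.sftpClient.put(uploadStream, remoteFilePath),
  hashDone,
]).then(([putResult, fileMd5Hash]) => {
  console.log('md5 of uploaded file:', fileMd5Hash);
  return putResult;
});

But this reads the file from disk twice, which I would like to avoid for large files; hence the question above.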