I'm using Node.js to extract data from some large remote files (sometimes larger than 2 GB).
Not all of the data in these files is needed, so to avoid downloading the whole file I currently use the sshpass command combined with cat and grep to pull out only what I need. However, I run into issues when the output of the cat | grep command is empty.
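For context, the current approach looks roughly like the sketch below (simplified; the host, file path, pattern and output path are placeholders, and I'm assuming here that the remote command is launched from Node via child_process):

import { exec } from 'child_process';
import { promisify } from 'util';
import * as fs from 'fs';

const execAsync = promisify(exec);

// Host, file path and pattern are placeholders for my real values;
// the password comes from an environment variable.
const cmd = `sshpass -p "$SSH_PASS" ssh user@remote-host 'cat /path/to/file | grep "match"'`;

// grep exits with status 1 when nothing matches, so execAsync rejects on an
// empty result even though nothing is actually wrong -- presumably that is
// where the empty-output problem comes from. maxBuffer also has to be raised,
// since exec buffers the whole output in memory (1 MB by default).
const { stdout } = await execAsync(cmd, { maxBuffer: 1024 * 1024 * 1024 });
await fs.promises.writeFile('/tmp/matches.txt', stdout);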
To be consistent, I want to stop using sshpass and instead use the ssh2-promise package to execute the cat | grep command, write the command's output to a file, process it, transform it into a CSV file, and load it into a table in my database (MariaDB).
The issue I'm having is that whenever the output of the command is larger than 512 MB, the data gets truncated and I lose a lot of records. This is how I'm using the package:
import SSH2Promise = require('ssh2-promise');
import { randomBytes } from 'crypto';
import * as fs from 'fs';

// Open an SSH connection with a unique id (sshConfig holds the host/user credentials)
const id = randomBytes(16).toString('hex');
const ssh = new SSH2Promise({ uniqueId: id, ...sshConfig });

// exec() buffers the whole remote output in memory before resolving
const command = 'cat fileName | grep "match"';
const commandResult = await ssh.exec(command);
await fs.promises.writeFile(filePath, commandResult);
How can I ensure that the output returned by ssh2-promise isn't truncated when it exceeds 512 MB?
I've considered handling these files in chunks and running grep on whatever I'm searching for, but that would require multiple SSH connections and some records would still be lost.
The only lead my research turned up was increasing the highWaterMark, but it doesn't seem to be a valid option to pass here, as the output doesn't change.
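For reference, the kind of streaming behaviour I'm hoping for would look something like the sketch below, assuming ssh2-promise's spawn() resolves to a readable channel stream that can be piped to disk instead of being buffered into a single string (sshConfig and filePath are the same as in my snippet above):

import SSH2Promise = require('ssh2-promise');
import { randomBytes } from 'crypto';
import * as fs from 'fs';
import { pipeline } from 'stream/promises';

const ssh = new SSH2Promise({ uniqueId: randomBytes(16).toString('hex'), ...sshConfig });

// spawn() should hand back the SSH channel as a stream, so the remote output
// can be written to disk chunk by chunk instead of being collected into one
// huge in-memory string the way exec() does.
const channel = await ssh.spawn('cat fileName | grep "match"');
await pipeline(channel, fs.createWriteStream(filePath));

await ssh.close();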
Any help would be appreciated.