
I want to encrypt data read from a file using `createCipheriv`, then compress it with a gzip stream and write it to a file via a write stream. But when I try this I get the following error:

stream.on is not a function
    at destroyer (internal/streams/pipeline.js:27:10)
    at internal/streams/pipeline.js:79:12
    at Array.map (<anonymous>)
    at pipeline (internal/streams/pipeline.js:76:28)
    at Promise (internal/util.js:274:30)
    at new Promise (<anonymous>)
    at pipeline (internal/util.js:273:12)
The error is thrown at the `pipeLinePromisified(...)` call.

Can someone explain what is wrong here?


const fs = require("fs");
const util = require("util");
const crypto = require("crypto");
const zlib = require("zlib");
const { pipeline } = require("stream");

const pipeLinePromisified = util.promisify(pipeline);

function someFunction(req) {
    // extract binary data in the req.on('data') handler (not shown)
    req.on('end', async () => {
        let fileName = 'something';

        //define streams and cipher with the secret key

        const iv = crypto.randomBytes(16);
        const secret = req.headers["secret"];
        const cipher = crypto.createCipheriv(
            config.algorithmToEncrypt,
            secret,
            iv
        );
        const compressStream = zlib.createGzip();
        let writeStream = fs.createWriteStream(
            `${config.uploadFolder}/${fileName}.gz`
        );

        try {
            // pipe the parsed result (binary data) through the cipher to encrypt it,
            // then compress it with the gzip stream and finally write it to the write stream
            req.body = result; // result is the parsed binary data
            await pipeLinePromisified(
                req,
                cipher,
                compressStream,
                writeStream
            );

            // resolve/reject come from an enclosing new Promise wrapper (not shown)
            return resolve({
                code: 201,
                payload: {
                    message: "Success!",
                    fileName: `${fileName}.gz`,
                    filePath: `somePath`
                }
            });
        } catch (err) {
            console.log(err);
            return reject({
                code: 400,
                payload: {
                    error: `Error`
                }
            });
        }
    })
}
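
For context, pipeline's internal destroyer calls `stream.on` on each argument, so "stream.on is not a function" means one of the things passed to `pipeLinePromisified` is not a stream. A minimal sketch of the same stages with every argument a real stream (the algorithm 'aes-256-cbc' and the 32-byte secret are assumptions; the question uses `config.algorithmToEncrypt`):

const util = require("util");
const fs = require("fs");
const zlib = require("zlib");
const crypto = require("crypto");
const { pipeline } = require("stream");

const pipelinePromisified = util.promisify(pipeline);

async function encryptAndCompress(sourceStream, outPath, secret) {
    // assumption: 'aes-256-cbc' expects a 32-byte key and a 16-byte IV
    const iv = crypto.randomBytes(16);
    const cipher = crypto.createCipheriv("aes-256-cbc", secret, iv);
    const gzip = zlib.createGzip();
    const out = fs.createWriteStream(outPath);
    // every argument here must be a stream; a string, Buffer or undefined
    // triggers "stream.on is not a function" inside pipeline's destroyer
    await pipelinePromisified(sourceStream, cipher, gzip, out);
    return iv; // the IV must be stored to decrypt the file later
}

As a side note, gzipping after encryption gains almost nothing, since ciphertext is essentially incompressible; source → gzip → cipher → file is usually the better order.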
  • Hey, so I'm not really sure what's happening here - you have a `req.on('end')` handler there and it seems that after it you try to do some streaming? Why is `req.body` overwritten, what are you trying to achieve there? If you can confirm that all you need is pushing the encrypted request payload to a file, then I could answer your question. – Michał Karpacki Apr 08 '20 at 11:48
  • @MichałKarpacki In req.on('end') I am doing some encryption etc.; before it, in req.on('data'), I am reading data from the file attached to the post request. req.body is overwritten because I parse the data from the file and then want to overwrite the req body with the already-parsed data in order to encrypt, compress and write it to a new file. If I missed something let me know – Rado Harutyunyan Apr 08 '20 at 11:55
  • Ok I see. Well, the problem is that you can't really use the data as a stream after it has already been consumed. Overwriting `req.body` won't change anything, sadly. What's the type of the data? Can you parse it in chunks or do you need all the data to be able to do that? – Michał Karpacki Apr 08 '20 at 12:09
  • @MichałKarpacki I need the whole data; the data is the content of the file. When I parse it, it is just a string – Rado Harutyunyan Apr 08 '20 at 12:11
  • @MichałKarpacki Just as info: I am parsing the data in chunks, but I need to do some parsing to cut out the needed content, so after parsing I have a string of the file content. I wanted to overwrite req so as not to create another stream for piping. – Rado Harutyunyan Apr 08 '20 at 12:15
  • I understand - so maybe you could also add an example (I'm not sure if you can post the parser code too) so that I'd see what is done there - depending on that my solution would be totally different. As to the overwriting, it won't work like that and you simply need another stream - but perhaps it can be an actual stream - then it's very memory- and CPU-efficient, much more than parsing the whole stream (see the sketch below). – Michał Karpacki Apr 08 '20 at 14:24
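
As the comments conclude, once `req` has been drained by the 'data' handlers it cannot be piped again, and assigning to `req.body` does not turn the parsed string back into a stream. A minimal sketch of the suggested fix, wrapping the parsed string in a fresh Readable (`parsedContent` is a hypothetical name for the string produced by the parsing step; `Readable.from` needs Node ≥ 12.3, or ≥ 10.17 on the 10.x line):

const { Readable } = require("stream");

// wrap the already-parsed string (hypothetical `parsedContent`) in a new source stream
const source = Readable.from(parsedContent);

// cipher, compressStream and writeStream as defined in the question
await pipeLinePromisified(source, cipher, compressStream, writeStream);

This keeps the rest of the pipeline unchanged: the only difference is that the first argument is now an actual stream instead of the already-consumed request.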

0 Answers