
I need to write an Azure Function in Node.js that compresses any file uploaded to Azure Blob Storage. I have this piece of code that does the job:

const zlib = require('zlib');
const fs = require('fs');

const def = zlib.createDeflate();

const input = fs.createReadStream('file.json');
const output = fs.createWriteStream('file-def.json');

input.pipe(def).pipe(output);

The definition of the Node.js Azure Function is this:

module.exports = async function (context, myBlob) {

where myBlob contains the content of the uploaded file.

The compression code above uses streams.

How can I convert the blob content into a stream, and then save the compressed result (the output variable in the script above) as a new file in Blob Storage, but in another container?

Thank you.


1 Answer


JavaScript and Java functions load the entire blob into memory, which can be accessed via context.bindings.Name (where Name is the input binding name specified in the function.json file).

For more details, please see Azure Blob storage trigger for Azure Functions.
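As an illustration, the bindings for such a function might be declared like this in function.json. This is a sketch: the binding names (myBlob, myOutputBlob), container names, and connection setting are assumptions you would adapt to your own app.

```json
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "input-container/{name}",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "myOutputBlob",
      "type": "blob",
      "direction": "out",
      "path": "output-container/{name}.deflate",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```

With an output binding like this, writing the compressed blob to another container is just an assignment to context.bindings.myOutputBlob; the Functions runtime handles the upload.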

Since the blob content is already in memory, a stream is not necessary to work with zlib. The code snippet below uses the synchronous deflateSync method from zlib to perform the compression.

const zlib = require('zlib');

// Blob content provided by the input binding
const input = context.bindings.myBlob;

const inputBuffer = Buffer.from(input);
const deflatedOutput = zlib.deflateSync(inputBuffer);

// The compressed buffer can then be assigned to the output binding
context.bindings.myOutputBlob = deflatedOutput;

You can refer to the link here for a discussion of this topic.