
I have an AWS Lambda function that is returning a response larger than 6 MB. (I know this isn't ideal and that it hits the AWS Lambda limit, which is 6 MB for the RequestResponse invocation type.)

I get the following error in this case:

413 Request entity too large

I want to compress this response before sending it to API Gateway. My Lambda is written in JavaScript, and I am using zlib to compress the response in gzip format, but I am getting an async error.

Code snippet :

const invocationResponse = await zlib.gzip(JSON.parse(data.Payload), function (error, result) {
  if (error) throw error
  return result
})

Error :

ProcessingLambdaInvoker: Error (asynchronously) invoking backend lambda:  TypeError [ERR_INVALID_ARG_TYPE]: The "chunk" argument must be one of type string or Buffer. Received type object
    at validChunk (_stream_writable.js:272:10)
    at Gzip.Writable.write (_stream_writable.js:307:21)
    at Gzip.Writable.end (_stream_writable.js:617:10) 

Can someone suggest a better way to compress the response in JS and send it back? Also, do I have to decompress it on the frontend, or will the browser do it automatically? I know that I will have to compress it and then encode it using base64 before sending it.

Vijay8608
  • If there is a hard limit on AWS Lambda, then why not save the zip to an S3 location and share the S3 location in the response? The client can use the shared S3 path to download the file (see the sketch below). – Dreamweaver Feb 27 '20 at 02:59
  • That's just one of the ways to do it, and I have already worked on doing it this way. I think it's a hard limit. – Vijay8608 Feb 28 '20 at 00:17
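
For reference, a minimal sketch of the approach Dreamweaver describes, assuming a Node.js Lambda using the AWS SDK v2; the bucket name, key layout, and buildLargeResponse helper are placeholders, not part of the original question:

const zlib = require('zlib');
const { promisify } = require('util');
const AWS = require('aws-sdk');

const gzip = promisify(zlib.gzip);
const s3 = new AWS.S3();

exports.handler = async (event) => {
  const payload = await buildLargeResponse(event); // placeholder for your own logic
  const compressed = await gzip(JSON.stringify(payload));

  // Illustrative bucket and key names only.
  const key = `responses/${Date.now()}.json.gz`;
  await s3.putObject({
    Bucket: 'my-response-bucket',
    Key: key,
    Body: compressed,
    ContentType: 'application/json',
    ContentEncoding: 'gzip',
  }).promise();

  // Return a short-lived presigned URL instead of the payload itself,
  // so the 6 MB RequestResponse limit no longer applies to the data.
  const downloadUrl = s3.getSignedUrl('getObject', {
    Bucket: 'my-response-bucket',
    Key: key,
    Expires: 300, // seconds
  });

  return { statusCode: 200, body: JSON.stringify({ downloadUrl }) };
};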

1 Answer


Please notice the error type ERR_INVALID_ARG_TYPE: zlib.gzip expects a string or Buffer, but JSON.parse returns an object. Use JSON.stringify instead.

Correct

zlib.gzip(JSON.stringify(data.Payload), ...)

Wrong

zlib.gzip(JSON.parse(data.Payload), ...)
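
Beyond the stringify fix, here is a minimal sketch of the end-to-end pattern, assuming a Lambda proxy integration with binary media types enabled in API Gateway; buildLargeResponse is a placeholder for your own logic. It promisifies zlib.gzip, compresses the stringified payload, base64-encodes it, and sets Content-Encoding: gzip so the browser decompresses it automatically:

const zlib = require('zlib');
const { promisify } = require('util');

// zlib.gzip is callback-based; promisify it so it can be awaited.
const gzip = promisify(zlib.gzip);

exports.handler = async (event) => {
  const payload = await buildLargeResponse(event); // placeholder for your own logic

  // gzip accepts a string or Buffer, so stringify the object first.
  const compressed = await gzip(JSON.stringify(payload));

  return {
    statusCode: 200,
    headers: {
      'Content-Type': 'application/json',
      // Tells the browser the body is gzipped; it will decompress it automatically.
      'Content-Encoding': 'gzip',
    },
    // API Gateway expects binary response bodies to be base64-encoded.
    isBase64Encoded: true,
    body: compressed.toString('base64'),
  };
};

Note that the 6 MB limit still applies to the compressed, base64-encoded body, so compression only helps if the payload shrinks below that threshold; otherwise the S3 approach from the comments is the usual workaround.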
Kim-Jimin