
I have developed a Lambda function that produces a list of JSON objects, i.e. a JSON array. The array is traversed and converted to an Excel sheet using the Node.js library 'excel4node'.

I have also configured SES for this Lambda function. The issue I'm facing is sending the workbook generated by the excel4node library to a verified email address through SES.

I couldn't find a way to get the path where the workbook is saved, so I can't send that file as an attachment using SES.

Code:

var AWS = require('aws-sdk');
var ses = new AWS.SES({
    region: 'us-west-2'
});

var excel = require('excel4node');
var workbook = new excel.Workbook();

// Add Worksheets to the workbook
var worksheet = workbook.addWorksheet('Sheet 1');

workbook.write('Excel.xlsx'); // How to send this workbook?

var eParams = {
    Destination: {
        ToAddresses: ["dest@example.com"]
    },
    Message: {
        Body: {
            Text: {
                Data: JSON.stringify(res) // For now I'm just sending the JSON array response variable in the body
            }
        },
        Subject: {
            Data: "Email Notification"
        }
    },
    Source: "source@example.com"
};

console.log('===SENDING EMAIL===');
var email = ses.sendEmail(eParams, function(err, data) {
    if (err) console.log(err);
    else {
        console.log("===EMAIL SENT===");
        // console.log(data);
        console.log("EMAIL CODE END");
        console.log('EMAIL: ', email);
    }
});
John Rotenstein
Matey Johnson

1 Answer


You first need to upload it to your S3 bucket. You can use the wb.writeToBuffer() method from the excel4node docs to generate a buffer and pass it to s3.upload(), since s3.upload() accepts "an arbitrarily sized buffer, blob, or stream, using intelligent concurrent handling of parts if the payload is large enough" (from the AWS docs). This avoids needing a file path at all: the workbook never has to be written to disk.

var AWS = require('aws-sdk');
var s3 = new AWS.S3();

wb.writeToBuffer().then(function(buffer) {
   var params = {
      Bucket: 'your-bucket-path',
      Key: 'your-file-name.ext',
      Body: buffer // the workbook as an in-memory buffer, no file path needed
   };
   s3.upload(params, function(err, data) {
      console.log(err, data);
   });
});
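Alternatively, the buffer from wb.writeToBuffer() can be emailed directly as an attachment with ses.sendRawEmail(), skipping S3 entirely: SES raw messages are just MIME strings, so you can base64-encode the buffer into a multipart/mixed message yourself. A minimal sketch, where the buildRawMessage helper is illustrative (not part of any library), and the stand-in buffer at the bottom would in practice come from wb.writeToBuffer().then(...):

```javascript
// Illustrative helper: build a multipart/mixed MIME message carrying an
// .xlsx buffer as a base64 attachment. The resulting string is what you
// would pass to ses.sendRawEmail({ RawMessage: { Data: raw } }, callback).
function buildRawMessage(from, to, subject, bodyText, attachmentBuffer, filename) {
    var boundary = 'NextPart_0001'; // any string not appearing in the parts
    return [
        'From: ' + from,
        'To: ' + to,
        'Subject: ' + subject,
        'MIME-Version: 1.0',
        'Content-Type: multipart/mixed; boundary="' + boundary + '"',
        '',
        '--' + boundary,
        'Content-Type: text/plain; charset=utf-8',
        '',
        bodyText,
        '',
        '--' + boundary,
        'Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
        'Content-Disposition: attachment; filename="' + filename + '"',
        'Content-Transfer-Encoding: base64',
        '',
        attachmentBuffer.toString('base64'), // the workbook bytes, encoded
        '',
        '--' + boundary + '--'
    ].join('\r\n');
}

// Stand-in buffer for demonstration; in the Lambda this comes from
// wb.writeToBuffer().then(function(buffer) { ... }).
var raw = buildRawMessage(
    'source@example.com', 'dest@example.com',
    'Email Notification', 'Report attached.',
    Buffer.from('fake xlsx bytes'), 'Excel.xlsx'
);
console.log(raw.indexOf('Content-Disposition: attachment') !== -1); // true
```

Note that the Source and all ToAddresses must still be SES-verified (or the account out of the sandbox), and SES caps raw message size, so very large workbooks are better served by the S3 approach above.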