
Requirement:

I have multiple files in a folder on my Express server. I pass these file names in an API call; the backend function needs to read all of these files, upload them to AWS S3, and then return an array of public URLs.

const aws = require("aws-sdk");
const fs = require("fs");

const s3 = new aws.S3();


module.exports = {
  upload: async function (req, res, next) {
    console.log("Inside upload controller");
    let publicUrls = [];

    const result = await Promise.all(
      req.body.files.map(async (uploadFile) => {
        fs.promises.readFile(uploadFile).then((file) => {
          let body = fs.createReadStream(uploadFile);
          const s3PutParams = {
            Bucket: process.env.S3_BUCKET_NAME,
            Key: uploadFile.substr(15),
            Body: body,
            ACL: "public-read",
          };

          s3.upload(s3PutParams)
            .promise()
            .then((response) => {
              console.log(response.Location);
              publicUrls.push(response.Location);
            });
        });
      })
    );
    if (result) {
      console.log("Result", result);
      res.json(publicUrls);
    }
  },
  
};

Observed Output:

Inside upload controller
Result [ undefined, undefined, undefined, undefined ]
https://xxx.s3.amazonaws.com/2_30062022.pdf
https://xxx.s3.amazonaws.com/1_30062022.pdf
https://xxx.s3.amazonaws.com/1_30062022.pdf
https://xxx.s3.amazonaws.com/2_30062022.pdf

I am passing an array of 4 file names, hence the 4 "undefined" values when logging "result".

Issue: The code is not awaiting the completion of Promise.all. It returns the JSON response right away, and publicUrls is still an empty array at that point.
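The all-undefined result can be reproduced with a small stand-alone sketch. Here `delayed` is a hypothetical stand-in for the S3 upload (not the real SDK call): an async map callback that fires off a promise chain without returning or awaiting it resolves to undefined immediately, which is exactly what Promise.all collects.

```javascript
// Stand-in for the S3 upload: resolves after a short delay.
// (Assumption for illustration only, not the real aws-sdk call.)
const delayed = (value) =>
  new Promise((resolve) => setTimeout(() => resolve(value), 10));

async function buggyUpload(items) {
  return Promise.all(
    items.map(async (item) => {
      // Fire-and-forget: the inner chain is neither returned nor awaited,
      // so this callback's promise resolves to undefined right away.
      delayed(item).then((v) => console.log("uploaded:", v));
    })
  );
}

// buggyUpload(["a.pdf", "b.pdf"]) resolves to [undefined, undefined]
// before any "uploaded:" line is logged.
```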

How can this be resolved?

John Rotenstein

1 Answer


Solved by referring to NodeJS write file to AWS S3 - Promise.All with async/await not waiting
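For completeness, the pattern from that answer can be sketched as follows. Here `fakeUpload` is an assumed stand-in for `s3.upload(params).promise()` (it resolves to an object with a `Location` property, like the real SDK response): each map callback returns its promise chain, so Promise.all waits for every upload and resolves to the array of locations, in input order.

```javascript
// fakeUpload stands in for s3.upload(params).promise().
// (Assumption for illustration only, not the real aws-sdk call.)
async function fakeUpload(params) {
  return { Location: `https://example-bucket.s3.amazonaws.com/${params.Key}` };
}

async function uploadAll(files) {
  // Return the promise chain from the callback; Promise.all then resolves
  // to the mapped values once every upload has finished.
  return Promise.all(
    files.map((uploadFile) =>
      fakeUpload({ Key: uploadFile }).then((response) => response.Location)
    )
  );
}

// uploadAll([...]) resolves to the public URLs in input order, so no
// shared publicUrls array is needed.
```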
