
I'm working on a Nuxt 3 project that lets users upload files (image, video, audio) and send them to AWS S3 at the same time.
In the Vue file, I declare a FormData object and append the file and other parameters to it.

index.vue

let formData = new FormData();
formData.append(
  "Key",
  `${config.public.UPLOAD_PATH}${uuid.value}/${file.name}`
);
formData.append("Body", file);
formData.append("ContentType", file.type);

const response = await useFetch("/api/s3", {
  method: "post",
  body: formData,
});

In s3.js, the formData is parsed with readMultipartFormData, and the file is uploaded to S3 through an S3Client PutObjectCommand:

  const body = await readMultipartFormData(event);
  const secretAccessKey = config.s3SecretKey;
  const accessKeyId = config.s3AccessKey;
  const region = config.s3Region;
  const bucket = config.s3Bucket;
  const client = new S3Client({
    region,
    credentials: {
      secretAccessKey,
      accessKeyId,
    },
  });

  const command = {
    Bucket: bucket,
    Key: body[0].data.toString(),
    Body: body[1].data,
    ContentType: body[2].data.toString(),
    Tagging: "expiration-days=1d",
  };
  // No await needed here: PutObjectCommand is a plain constructor.
  const s3Command = new PutObjectCommand(command);

  return client.send(s3Command);

This code works locally and the file is indeed uploaded to S3.
But when I build and deploy the Nuxt 3 project on Lambda, the handler errors out: readMultipartFormData finds no data in the body.
Why can't the formData be parsed through readMultipartFormData on Lambda?
Is the body base64-encoded?

AngusHo53

2 Answers


I am assuming that you are uploading a single file only.

I think the Body value should be `Body: body[0].data,` instead of `Body: body[1].data,`:

// Incorrect 
 const command = {
    Bucket: bucket,
    Key: body[0].data.toString(),
    Body: body[1].data,
    ContentType: body[2].data.toString(),
    Tagging: "expiration-days=1d",
 };
 
 // ✅ Correct
 const command = {
    Bucket: bucket,
    Key: body[0].data.toString(),
    Body: body[0].data,
    ContentType: body[0].type,
    Tagging: "expiration-days=1d",
 };
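As an aside (a sketch of my own, not part of the original code): readMultipartFormData returns part objects carrying `{ name, filename, type, data }`, so looking parts up by their form-field name is less fragile than indexing them positionally. The helper name `partByName` is hypothetical:

```javascript
// Sketch: find a multipart part by its form-field name instead of its position.
// Each part from readMultipartFormData has { name, filename, type, data }.
function partByName(parts, name) {
  const part = parts.find((p) => p.name === name);
  if (!part) throw new Error(`Missing form field: ${name}`);
  return part;
}
```

With a helper like this, the handler could use `partByName(body, "Body").data` regardless of the order in which the client appended the fields.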
ReaganM
  • But body[0] is "KEY" and the file is in body[1]. – AngusHo53 May 08 '23 at 07:43
  • Are you uploading a single file only? – ReaganM May 08 '23 at 07:44
  • Yes, I'm only uploading a single file. – AngusHo53 May 08 '23 at 07:44
  • I edited my answer. I did test your code from uploading from the client side using your API and I am able to see data inside the `readMultipartFormData(event)` method – ReaganM May 08 '23 at 07:52
  • I can indeed run it successfully locally, but after deploying to Lambda the error appears. – AngusHo53 May 08 '23 at 07:54
  • Have you tried this codes and run it? `const command = { Bucket: bucket, Key: body[0].data.toString(), Body: body[0].data, ContentType: body[0].type Tagging: "expiration-days=1d", };` If you are still seeing the issue. I think the issue is something related to the lambda configuration. – ReaganM May 08 '23 at 07:57
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/253530/discussion-between-reaganm-and-angusho53). – ReaganM May 08 '23 at 07:58
  • Yes, I still have this issue. – AngusHo53 May 08 '23 at 07:58

I solved my problem!
The issue is that AWS Lambda base64-encodes the event body; it just needs to be decoded in the handler before parsing:

  // Lambda delivers the raw body base64-encoded, so decode it before handing
  // the event to readMultipartFormData. Note: decoding with 'latin1' (binary)
  // rather than 'ascii' avoids corrupting bytes above 0x7F in the file data.
  const new_event = deepCopy(event);
  new_event.node.req.body = Buffer.from(event.node.req.body, 'base64').toString('latin1');

  const body = await readMultipartFormData(new_event);
  console.log(body);

  // Recursively copy plain objects and arrays; primitives are returned as-is.
  function deepCopy(obj) {
    if (typeof obj !== 'object' || obj === null) {
      return obj;
    }

    let copy = Array.isArray(obj) ? [] : {};

    Object.keys(obj).forEach(key => {
      copy[key] = deepCopy(obj[key]);
    });

    return copy;
  }
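The decoding step above can be isolated into a small helper. This is a minimal sketch of my own (the function name `decodeLambdaBody` is hypothetical), assuming the Lambda proxy event carries an `isBase64Encoded` flag as the AWS API Gateway payload format does:

```javascript
// Sketch: restore a Lambda-delivered request body to raw bytes.
// Keep the result as a Buffer so binary file bytes and multipart
// boundaries survive intact (an ASCII string conversion would corrupt them).
function decodeLambdaBody(rawBody, isBase64Encoded) {
  return isBase64Encoded
    ? Buffer.from(rawBody, "base64")
    : Buffer.from(rawBody);
}
```

The resulting Buffer can then be placed on the copied event before calling readMultipartFormData, as in the solution above.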
AngusHo53