
I have the following code to upload files received by my API (sent as a multipart form and parsed with formidable), which works perfectly fine in my dev environment:

const product = {

    post: async (req, res) => {
      await dbConnect()

      const form = new formidable.IncomingForm({
        multiples: true,
        keepExtensions: true,
      })

      const s3 = new S3({
        accessIdKey: process.env.ACCESS_KEY_AWS,
        secretAccessKey: process.env.SECRET_KEY_AWS,
      })
      
      form.parse(req, async (error, fields, data) => {
        
        if (error) {
          return res.status(500).json({ success: false })
        }

        const { files } = data

        const filesToUpload = files instanceof Array
          ? files
          : [files]

        let filesToSaveOnDb = []

        async function uploadFile(filesToUpload) {
          for(let file of filesToUpload) {
            try {
              const timestamp = Date.now()
              const random = Math.floor(Math.random() * 999999999) + 1
              const extension = path.extname(file.name)

              const Key = `${timestamp}_${random}${extension}`

              const fileToUpload = fs.readFileSync(file.path)

              const uploadedImage = await s3.upload({
                Bucket: process.env.BUCKET_NAME,
                Key,
                Body: fileToUpload,
                ContentType: "image/*"
              }).promise()

              filesToSaveOnDb.push({
                name: Key,
                path: `${uploadedImage.Location}`,
              }) 
[...rest of the code...]

My code is hosted on AWS Amplify.

As I stated, this code works as intended when running with "npm run dev" on my local machine.

On production, however, the product is saved but the images are not uploaded to S3. On CloudWatch logs, the following error is thrown: Error: CredentialsError: Missing credentials in config, if using AWS_CONFIG_FILE, set AWS_SDK_LOAD_CONFIG=1

What I already tried and checked:

  • Environment variables are correctly set up on Amplify.

  • Build settings on Amplify have the following line to pass the env vars to production:

    • env | grep -e MONGODB_URI -e APP_URL -e NEXTAUTH_URL -e NEXTAUTH_SECRET -e SECRET_KEY_AWS -e BUCKET_NAME -e ACCESS_KEY_AWS >> .env.production
  • Debugged with console.log, and the contents of the environment variables show up in the CloudWatch logs, so the code is able to access them.

  • S3 Bucket is set to public access.

  • The IAM user (holder of the access key and secret key) has the "AmazonS3FullAccess" policy attached.

  • Tried the S3 JS SDK v3 as well: it throws a different, misleading error which, from my research, also means the credentials are not being picked up.

  • Tried setting the AWS configuration inline, with no success:

    AWS.config.update({
      accessIdKey: process.env.ACCESS_KEY_AWS,
      secretAccessKey: process.env.SECRET_KEY_AWS,
      region: "sa-east-1",
    })
    

I'm really lost on what the problem may be.

João Textor

1 Answer


So, I realized that it does not matter whether I use AWS.config.update or pass my credentials when instantiating the S3 class. The credentials used (at least when hosting the app with Amplify) will be the ones from the role created when I set up the Amplify app (amplify init).

This way, I can get rid of the environment variables.

To solve the "Missing credentials" issue, which is somewhat misleading, I had to open the IAM Management Console, click on "Roles", select the Amplify Unauthenticated Role (as I'm not uploading with Cognito authentication), and create an inline policy with the following:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::{BUCKET_NAME}/public/*"
            ],
            "Effect": "Allow"
        }
    ]
}
On the bucket itself, I unchecked "Block public access" and set the following bucket policy:

{
    "Version": "2012-10-17",
    "Id": "ExamplePolicy01",
    "Statement": [
        {
            "Sid": "ExampleStatement01",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": [
                "s3:*"
            ],
            "Resource": [
                "arn:aws:s3:::{BUCKET_NAME}/*",
                "arn:aws:s3:::{BUCKET_NAME}"
            ]
        }
    ]
}

This solved the issue in production. I don't know, however, why the same code already worked on my local machine.
