
I have the following function that uploads a file to S3:

import fs from "node:fs";
import {
  PutObjectCommand,
  S3Client,
} from "@aws-sdk/client-s3";

const client = new S3Client({
  endpoint: "https://fra1.digitaloceanspaces.com",
  region: "fra1",
  credentials: {
    accessKeyId: Env.get("BUCKET_KEY"),
    secretAccessKey: Env.get("BUCKET_SECRET"),
  },
});

const Bucket = "customers-space";

const Key = "customers.csv";

export const saveFileInSpace = async (path: string) => {
  const Body = fs.createReadStream(path);
  await client.send(
    new PutObjectCommand({
      Bucket,
      Key,
      ACL: "private",
      Body,
    })
  );
};

My file will be quite big (only ~50 MB now, but it will grow), so I used a stream, as in the AWS example.

However, the call fails with the following error:

MissingContentLength: UnknownError
     at throwDefaultError (/workspace/node_modules/@aws-sdk/smithy-client/dist-cjs/default-error-handler.js:8:22)
     at deserializeAws_restXmlPutObjectCommandError (/workspace/node_modules/@aws-sdk/client-s3/dist-cjs/protocols/Aws_restXml.js:5782:43)
     at runMicrotasks (<anonymous>)
     at processTicksAndRejections (node:internal/process/task_queues:96:5)
     at /workspace/node_modules/@aws-sdk/middleware-serde/dist-cjs/deserializerMiddleware.js:7:24
     at /workspace/node_modules/@aws-sdk/middleware-signing/dist-cjs/middleware.js:14:20
     at StandardRetryStrategy.retry (/workspace/node_modules/@aws-sdk/middleware-retry/dist-cjs/StandardRetryStrategy.js:51:46)
     at /workspace/node_modules/@aws-sdk/middleware-flexible-checksums/dist-cjs/flexibleChecksumsMiddleware.js:56:20
     at /workspace/node_modules/@aws-sdk/middleware-logger/dist-cjs/loggerMiddleware.js:6:22
     at saveFileInSpace (/workspace/build/index.js:37579:3)
     at exportCsv (/workspace/build/index.js:39452:5)
     at processResults (/workspace/build/index.js:39709:8)
     at bulkFetchCustomers (/workspace/build/index.js:39759:7)
     at Worker.Queue.lockDuration [as processFn] (/workspace/build/index.js:39738:7)
     at Worker.processJob (/workspace/node_modules/bullmq/src/classes/worker.ts:644:22)
     at Worker.retryIfFailed (/workspace/node_modules/bullmq/src/classes/worker.ts:774:16)
     at Worker.run (/workspace/node_modules/bullmq/src/classes/worker.ts:370:34) {
   '$fault': 'client',
   '$metadata': {
     httpStatusCode: 411,
     requestId: undefined,
     extendedRequestId: undefined,
     cfId: undefined,
     attempts: 1,
     totalRetryDelay: 0
   },
   Code: 'MissingContentLength',
   BucketName: 'customers-space',
   RequestId: '...',
   HostId: '...'
}

None of the examples I found online use the `ContentLength` parameter.

I am also using a stream, which has no idea of its size.
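(For context: one common workaround, sketched below with a hypothetical helper name, is to read the file size up front with `fs.statSync` and pass it explicitly as `ContentLength`, so the SDK never has to guess the stream's length.)

```typescript
import fs from "node:fs";

// Hypothetical helper: builds the PutObjectCommand input, adding an
// explicit ContentLength read from the file on disk so the SDK does
// not have to buffer the whole stream to learn its size.
export const buildPutObjectInput = (path: string, Bucket: string, Key: string) => {
  const { size } = fs.statSync(path); // size in bytes
  return {
    Bucket,
    Key,
    ACL: "private" as const,
    Body: fs.createReadStream(path),
    ContentLength: size,
  };
};
```

With this, the upload call would become `await client.send(new PutObjectCommand(buildPutObjectInput(path, Bucket, Key)))`.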

Version of the SDK:

"dependencies": {
  "@aws-sdk/client-s3": "3.218.0",
  ...
}

What am I doing wrong?

  • You always need to specify the content length for S3 (API requirement) - you just need to get it all into memory and calculate the size. How much is 'quite big'? It may be worth buffering and using multipart uploads depending on the size. – Ermiya Eskandary Dec 21 '22 at 15:31
  • @ErmiyaEskandary so why is there no `ContentLength` in the AWS user guide? Previously I also uploaded it successfully. – Nullndr Dec 21 '22 at 15:34
  • Can you provide a link please? – Ermiya Eskandary Dec 21 '22 at 15:34
  • @ErmiyaEskandary a link for what? For the user guide? https://docs.aws.amazon.com/AmazonS3/latest/userguide/upload-objects.html under the `Using the AWS SDK` -> javascript – Nullndr Dec 21 '22 at 15:36
  • Ultimately, the client will buffer it to calculate the content length - which version of the SDK are you using? Are you sure your path is valid? – Ermiya Eskandary Dec 21 '22 at 15:40
  • @ErmiyaEskandary, you mean the `@aws-sdk/client-s3` package version? It is the `3.218.0`. I am sure about the path being valid – Nullndr Dec 21 '22 at 15:43
  • You're not using S3, but rather an S3-compatible API. It appears to have subtly different requirements than S3 does, so the AWS SDK won't work. You should ask DigitalOcean what their supported SDKs are. – Anon Coward Dec 21 '22 at 16:59

0 Answers