
I'm testing an S3 upload. The method to test is:

export class ProcessData {
  constructor() {}

  async process(): Promise<void> {
    const data = await s3Client.send(new GetObjectCommand(bucket));
    await parseCsvData(data.Body);
  }
}

This is my attempt at the test case.

import { S3Client } from '@aws-sdk/client-s3';
import path from 'path';

jest.mock("aws-sdk/clients/s3", () => {
  return {
    S3Client: jest.fn(() => ({
      send: jest.fn().mockImplementation(() => ({
        data: Buffer.from(require("fs").readFileSync(path.resolve(__dirname, "test.csv")))
      }))
    }))
  };
});

describe("@aws-sdk/client-s3 mock", () => {
  test('CSV Happy path', async () => {
    const processData = new ProcessData();
    await processData.process();
  });
});

The process gets to the parse method and throws the error "The bucket you are attempting to access must be addressed using the specified endpoint".

– Interlated

3 Answers


For anyone who wants to mock the client directly, you can use the aws-sdk-client-mock library, which is recommended by the AWS SDK team.

Here is an introductory tutorial

The initial steps:

import fs from 'fs';

import { GetObjectCommand, S3Client } from '@aws-sdk/client-s3';
import { mockClient } from 'aws-sdk-client-mock';

const mockS3Client = mockClient(S3Client);

And then you can mock it this way:

mockS3Client.on(GetObjectCommand).resolves({
  Body: fs.createReadStream('path/to/some/file.csv'),
});

You can also spy on the client:

const s3GetObjectStub = mockS3Client.commandCalls(GetObjectCommand);

// s3GetObjectStub[0] here refers to the first call of GetObjectCommand
expect(s3GetObjectStub[0].args[0].input).toEqual({
  Bucket: 'foo',
  Key: 'path/to/file.csv'
});

Update May 6th, 2023

Since version 3.188.0 (Oct 2022), the S3 client ships util functions to consume and parse the response streams.

So now, to mock the response, we need to wrap it with the util function sdkStreamMixin():

import fs from 'fs';

import { GetObjectCommand, S3Client } from '@aws-sdk/client-s3';
import { sdkStreamMixin } from '@aws-sdk/util-stream-node';
import { mockClient } from 'aws-sdk-client-mock';

const mockS3Client = mockClient(S3Client);
const stream = sdkStreamMixin(fs.createReadStream('path/to/some/file.csv'))

mockS3Client.on(GetObjectCommand).resolves({
  Body: stream,
});
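
With that in place, a complete happy-path test for a class like the ProcessData from the question could look as follows (a sketch; the import path of the class under test is a placeholder, and it assumes process() consumes the body via the stream helpers such as transformToString()):

import fs from 'fs';

import { GetObjectCommand, S3Client } from '@aws-sdk/client-s3';
import { sdkStreamMixin } from '@aws-sdk/util-stream-node';
import { mockClient } from 'aws-sdk-client-mock';

import { ProcessData } from '../src/ProcessData'; // hypothetical path to the class under test

const mockS3Client = mockClient(S3Client);

describe('ProcessData', () => {
  // Clear recorded calls and stubbed behaviour between tests
  beforeEach(() => mockS3Client.reset());

  test('CSV happy path', async () => {
    const stream = sdkStreamMixin(fs.createReadStream('path/to/some/file.csv'));
    mockS3Client.on(GetObjectCommand).resolves({ Body: stream });

    await new ProcessData().process();

    // The mock records every call, so the request can be asserted on as well
    expect(mockS3Client.commandCalls(GetObjectCommand)).toHaveLength(1);
  });
});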
– akmalmzamri
Any chance you could expand on the type that is used there for the 'Body' attribute in the mocked resolve? I am having trouble reconciling it. See new question here: https://stackoverflow.com/questions/76166889/mocking-getobjectcommand-response-using-v3-of-the-sdk-using-file-system-object. – tmc May 03 '23 at 18:18

Applying the example worked:

https://github.com/awsdocs/aws-doc-sdk-examples/blob/master/javascriptv3/example_code/s3/tests/s3_putbucketpolicy.test.js
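
This mocks a thin wrapper module around the client rather than the SDK package itself. The wrapper might look roughly like this (a sketch; the actual contents of ../clients/s3Client.ts, including the region, are assumptions):

// clients/s3Client.ts - one shared client instance that tests can replace wholesale
import { S3Client } from '@aws-sdk/client-s3';

export const s3Client = new S3Client({ region: 'us-east-1' });

The test then mocks that module: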

import path from 'path';
import { s3Client } from '../clients/s3Client';
jest.mock("../clients/s3Client.ts");

describe("@aws-sdk/client-s3 mock", () => {
  test('CSV Happy path', async () => {
    // @ts-ignore
    s3Client.send.mockResolvedValue({
      Body: Buffer.from(require("fs").readFileSync(path.resolve(__dirname, "nrl-small2.csv")))
    });
  });
});
– Interlated

Moto

I recommend using Moto: https://github.com/spulec/moto

It provides a standalone server that implements many AWS services' endpoints: http://docs.getmoto.org/en/latest/docs/getting_started.html#stand-alone-server-mode

Docker container: https://hub.docker.com/r/motoserver/moto

How to use

Run the server as a Docker container, expose its endpoint to your local environment as an environment variable, and pass this endpoint to the S3 client. Use region us-east-1 to avoid a 409 error on S3 bucket creation.

import { S3Client } from '@aws-sdk/client-s3';

const env = process.env.ENVIRONMENT;
// In dev, point the client at the local Moto endpoint; otherwise use the default AWS endpoint
const endpoint = env === 'dev' ? process.env.AWS_ENDPOINT : undefined;
const s3 = new S3Client({ endpoint, region: 'us-east-1' });

All your S3 requests will then go to your local Moto server. While the server is running it keeps objects and other state, so you can upload and download objects and verify their integrity.
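
For example, an integration test can round-trip an object through the local server (a sketch; the bucket name, key, and dummy credentials are placeholders, and Body.transformToString() requires SDK version 3.188.0 or later):

import { CreateBucketCommand, GetObjectCommand, PutObjectCommand, S3Client } from '@aws-sdk/client-s3';

const s3 = new S3Client({
  endpoint: process.env.AWS_ENDPOINT, // e.g. http://localhost:5000
  region: 'us-east-1',
  forcePathStyle: true, // address buckets by path so requests stay on the local endpoint
  credentials: { accessKeyId: 'test', secretAccessKey: 'test' }, // Moto accepts any credentials
});

test('round-trips an object through Moto', async () => {
  await s3.send(new CreateBucketCommand({ Bucket: 'test-bucket' }));
  await s3.send(new PutObjectCommand({ Bucket: 'test-bucket', Key: 'file.csv', Body: 'a,b,c' }));

  const res = await s3.send(new GetObjectCommand({ Bucket: 'test-bucket', Key: 'file.csv' }));
  expect(await res.Body?.transformToString()).toBe('a,b,c');
});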

The best part is that you can run a large part of the AWS environment locally and run integration tests against it. This speeds up development, documents the input and output of each API, and makes debugging production issues locally much easier.

Example

A Lambda function is triggered by the creation of an S3 object and writes its output to S3 and a MySQL database. A test skeleton following these steps is sketched after the list.

before running:

  • load the test data and write it to S3
  • create a Docker MySQL database with the needed schema and mock data

run:

  • execute the Lambda function with the SNS event test data

after:

  • read the content of the S3 output object to verify it
  • read the content of the MySQL database to verify it
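
A test skeleton following these steps might look like this (a sketch; the handler import, bucket names, event fixture, and database credentials are all placeholders):

import fs from 'fs';
import mysql from 'mysql2/promise';
import { GetObjectCommand, PutObjectCommand, S3Client } from '@aws-sdk/client-s3';
import { handler } from '../src/handler'; // hypothetical Lambda entry point

const s3 = new S3Client({ endpoint: process.env.AWS_ENDPOINT, region: 'us-east-1', forcePathStyle: true });

test('lambda writes its output to s3 and mysql', async () => {
  // before: load the test data and write it to s3
  await s3.send(new PutObjectCommand({ Bucket: 'input-bucket', Key: 'input.csv', Body: fs.readFileSync('test/input.csv') }));

  // run: execute the lambda function with the sns event test data
  await handler(JSON.parse(fs.readFileSync('test/sns-event.json', 'utf8')));

  // after: read the s3 output object to verify its content
  const out = await s3.send(new GetObjectCommand({ Bucket: 'output-bucket', Key: 'output.csv' }));
  expect(await out.Body?.transformToString()).toContain('expected');

  // after: read from the mysql database (the docker container from the setup) to verify the content
  const conn = await mysql.createConnection({ host: 'localhost', user: 'root', password: 'test', database: 'testdb' });
  const [rows] = await conn.execute('SELECT COUNT(*) AS n FROM output_table');
  expect((rows as any[])[0].n).toBeGreaterThan(0);
  await conn.end();
});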
– wasserholz