I'm new to Lambda and API Gateway. I have done simple things with these services, but now I have a requirement where a user can upload an Excel file. The upload simply triggers an API Gateway endpoint which is integrated with a Lambda function. I don't want to save the file; I just want to send the file data in CSV/JSON format to my Lambda function (Node.js code) and then process/transform/persist the data. Does anybody know how to do it? I've tried a few things but I am really lost.
5 Answers
As I said in my question, I don't need to save the file. I pass the file data through API Gateway (proxy mode) to the Lambda function, parse the request with parse-multipart, and then parse the part's buffer with node-xlsx to get readable data.

You could convert your file into a Buffer object and stream the upload (using fs also works) to your API. Once your Lambda receives it in its params, you can convert it back to CSV and write it out, upload it to S3, or convert it to base64 and store it in a database, whichever approach your architecture calls for.

I would not bring API Gateway into the equation; there are size limits on uploads through API Gateway (10 MB for the request payload).

Browser -- CloudFront -- S3 Bucket (Trigger) -- Lambda

If you want to secure the upload to S3, you can use a pre-signed URL to protect your endpoint:

Browser -- CloudFront -- API Gateway -- Lambda (returns pre-signed URL) -- Upload with the above steps

Once the file is uploaded, Lambda can read the S3 object/file and process it from there.

In my projects, this is how I do it:

- The frontend converts the file into a base64 string.
- The base64 string is included in a JSON payload.
- The JSON payload is sent to an API Gateway POST endpoint.
- API Gateway triggers a Lambda handler that converts the base64 string into a Node.js Buffer object.
- The Buffer object is passed to aws-sdk's S3.upload() method, which returns the S3 key of the uploaded object.
- I then save this S3 key in the database.
Take note that this approach is subject to API Gateway's payload size limit (10 MB), so my frontend checks the file size before sending it to the API. If you need to go over API Gateway's limit, you have to upload to S3 directly from the client side/frontend.

I was stuck on this task for a while; here is the final working solution.
- Create your API endpoint through API Gateway.
- Create a route that triggers the Lambda function responsible for uploading/getting the file.
- In the AWS Console, go to API Gateway -> XYZ API (the API you just created) -> Settings, scroll to the end of the page to Binary Media Types, and add */*.
- Go to Resources in your API, select your route, click ANY, select Method Response, and, as the final step, set the Content Type in the response model to application/vnd.openxmlformats-officedocument.spreadsheetml.sheet.
At this point you have configured the initial setup, now it's time to save and retrieve xlsx file from s3 bucket through lambda function.
Upload File Code
Send the file in binary format (for testing purposes, use Postman). Make sure the Lambda function you are using has the required permissions.
const AWS = require("aws-sdk");
const s3 = new AWS.S3();

exports.handler = async (event) => {
  // Create the bucket if it does not exist yet (succeeds if you already own it)
  await s3.createBucket({ Bucket: "Your_Bucket_Name" }).promise();

  // event.body is base64-encoded because Binary Media Types is set to */*
  const saveFileResponse = await s3
    .putObject({
      Bucket: "Your_Bucket_Name",
      Body: Buffer.from(event.body, "base64"),
      Key: "test-file.xlsx",
      ContentType: event.headers["Content-Type"],
    })
    .promise();

  return {
    statusCode: 200,
    body: JSON.stringify("New file stored successfully"),
  };
};
Get File Code
exports.handler = async (event) => {
  const s3Object = await s3
    .getObject({
      Bucket: "Your_Bucket_Name",
      Key: "test-file.xlsx",
    })
    .promise();

  const responseBody = s3Object.Body.toString("base64");
  return {
    statusCode: 200,
    body: responseBody,
    headers: {
      "Content-Type": s3Object.ContentType,
      "Access-Control-Allow-Origin": "*",
      "Content-Disposition": "attachment; filename=test-file.xlsx",
    },
    isBase64Encoded: true,
  };
};
Hope this helps :)
