I am trying to read a file of around 1 GB from an S3 bucket. My objective is to read the data from the file and send it over to another server.
At the moment, when I try to read a large file (1 GB), my system hangs up / the server crashes. I am able to log the data of a 240 MB file to the console with the following segment of code:
var bucketParams = {
    Bucket: "xyzBucket",
    Key: "input/something.zip"
};

router.get('/getData', function(req, res) {
    s3.getObject(bucketParams, function(err, data) {
        if (err) {
            console.log(err, err.stack); // an error occurred
        } else {
            console.log(data); // successful response
        }
    });
    // Send data over to another server
});
How would this work when it comes to reading large files from S3?
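I suspect the problem is that getObject buffers the entire object into memory (data.Body), and that a streaming approach is needed instead. Below is a rough sketch of what I have in mind, piping the result of createReadStream() from the SDK into an outgoing request to the other server. The hostname, path and headers are placeholders, not my real setup, and I am not sure this is the right way to do it:

var https = require('https');

router.get('/getData', function(req, res) {
    // Placeholder target server; the real host/path would be different
    var upstream = https.request({
        hostname: 'other-server.example.com',
        path: '/upload',
        method: 'POST',
        headers: { 'Content-Type': 'application/zip' }
    }, function(upstreamRes) {
        res.sendStatus(upstreamRes.statusCode);
    });

    // Stream the object instead of buffering the whole body in memory
    var s3Stream = s3.getObject(bucketParams).createReadStream();

    s3Stream.on('error', function(err) {
        console.log(err, err.stack);
        upstream.abort();
        res.sendStatus(500);
    });

    // Pipe S3 -> outgoing request so only small chunks are held at a time
    s3Stream.pipe(upstream);
});

Would piping like this avoid holding the whole 1 GB in memory, or is there some other approach (for example, ranged GETs) that is usually used for files this size?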