I am trying to automate continuous data copying from my database to an S3 bucket, following the tutorial "How to Automate Continuous Data Copying from MongoDB to S3".

I created a federated database instance called "FederatedAnalyticsInstance" using the "analytics" database and the "events" collection on my dev cluster. I am now trying to get my trigger function to work:

```javascript
exports = function () {
  // Federated database instance, configured as a linked service on the trigger
  const datalake = context.services.get("FederatedAnalyticsInstance");
  const db = datalake.db("analytics-test");
  const coll = db.collection("events");

  const pipeline = [
    {
      $match: {
        "created": {
          $gt: new Date("2020-01-01T00:00:00.00+00:00"),
          $lt: new Date("2020-01-13T23:59:59.999+00:00")
        }
      }
    },
    {
      "$out": {
        "s3": {
          "bucket": "322104163088-mongodb-data-ingestion",
          "region": "eu-west-2",
          "filename": "analytics/",
          /* "filename": { "$concat": ["analytics", "$_id"] }, */
          "format": {
            "name": "json",
            "maxFileSize": "100GB"
          }
        }
      }
    }
  ];

  return coll.aggregate(pipeline);
};
```

When I run this, I get:

```
error: (InternalError) an internal error occurred, correlationID = 171cda4b13639900d898e825
```

When I connect to "FederatedAnalyticsInstance" in VS Code and run this query:

```javascript
const database = 'analytics-test';
const collection = 'events';

use(database);

db.events.find({
  "created": {
    $gt: new Date('2020-01-01T00:00:00.00+00:00'),
    $lt: new Date('2020-01-13T23:59:59.999+00:00')
  }
})
```

I get two documents as results.

I have also tried many different functions, but they return empty results and no data is copied to my S3 bucket.
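For example, one stripped-down variant along these lines (a rough sketch of the kind of function I tried; the `await`/`toArray()` usage reflects my understanding of the Functions runtime, and the pipeline just drops the `$out` stage to check whether the match returns anything at all):

```javascript
exports = async function () {
  // Same federated instance as above, but without the $out stage,
  // only checking whether the $match stage finds any documents.
  const datalake = context.services.get("FederatedAnalyticsInstance");
  const coll = datalake.db("analytics-test").collection("events");

  const docs = await coll.aggregate([
    {
      $match: {
        "created": {
          $gt: new Date("2020-01-01T00:00:00.00+00:00"),
          $lt: new Date("2020-01-13T23:59:59.999+00:00")
        }
      }
    }
  ]).toArray();

  console.log(`matched ${docs.length} documents`);
  return docs;
};
```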

What do I do?

