
We have a 600+ GB MongoDB database with 6 collections, hosted on an AWS EC2 Ubuntu 14 instance (m4.2xlarge). I have two questions:

1) How can I safely back up a 600 GB database?

2) How can I archive the MongoDB database periodically without any data loss?

Pritish Pandey
  • [Supported backup methods](https://docs.mongodb.com/manual/core/backups/) are covered in the MongoDB server documentation. Recommended approaches may vary depending on your deployment type (standalone, replica set, or sharded cluster) and MongoDB server version, but with 600+ GB I assume you'd want to use either continuous backup or filesystem/EBS snapshots. What is your specific version of MongoDB server and what type of deployment do you have? Can you elaborate on what you mean by archiving periodically -- does that refer to saving old backups, exporting old data, or something else? – Stennie Nov 23 '19 at 04:56
  • Thanks, Stennie. We have a standalone deployment, and our EBS volume contains all the code/services, Postgres data, and Mongo data. 90% of our 1 TB EBS volume is used up. What we want is: 1) Take a backup of the large 600 GB database/collections and save it to an S3 bucket. 2) From then on, take monthly backups, keeping only the last 6 months of data in the database and moving backups to the S3 bucket. That way we will never have to grow beyond a 1 TB EBS volume. Mongo version is 3.0.8. – Pritish Pandey Nov 27 '19 at 11:10

0 Answers