
I'm having an issue dumping data from a mongo container that runs in a swarm. I can't use docker run against the swarm, and I can't connect another container to run mongodump, because the main overlay network is not manually attachable. I googled this issue and only found solutions based on docker-compose --link, which doesn't work in swarm mode.

My plan was (rough sketch after the list):

  1. Run another mongo container with the command mongodump --host main_mongo_container --out some_volume.
  2. Compress the dump into a tar archive.
  3. Upload the dump to S3.
  4. Run the script via cron.
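Roughly, the script I had in mind looks like this. The network, container, and bucket names are placeholders for my real setup, and step 1 is exactly where it breaks, because the overlay network was not created with --attachable:

    #!/bin/sh
    # Dump, compress, upload. Assumes the aws CLI is installed on the host.
    set -e

    stamp=$(date +%Y%m%d-%H%M%S)

    # 1. Dump from a throwaway mongo container joined to the app's network.
    #    This is the step that fails: the overlay network is not attachable.
    docker run --rm \
      --network app_network \
      -v /backups:/backups \
      mongo:4 \
      mongodump --host main_mongo_container --out "/backups/$stamp"

    # 2. Compress the dump into a tar archive.
    tar -czf "/backups/$stamp.tar.gz" -C "/backups/$stamp" .

    # 3. Upload to S3 (my-bucket is a placeholder).
    aws s3 cp "/backups/$stamp.tar.gz" "s3://my-bucket/mongo/$stamp.tar.gz"
    rm -rf "/backups/$stamp"

    # 4. Run from cron, e.g.:
    #    0 * * * * /usr/local/bin/mongo-backup.sh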

I don't have enough experience to solve this myself. Does anyone have experience automating mongo data dumps from a swarm container to S3?

Many thanks in advance!

kotmsk

1 Answer


Why not run a swarm service that performs an hourly backup? Then you can automate it with a script to upload the dump wherever you need it, or just store it on an EBS volume. Here's a simple example using DigitalOcean block storage.
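A minimal sketch of that idea, assuming your mongo service is named mongo, the shared overlay network is called backend, and a named volume mongo-backups holds the dumps (all placeholders for your own names). Because the service is part of the swarm, it can reach mongo over the overlay network without the network being attachable:

    # Long-running swarm service that dumps the database every hour
    # into a named volume.
    docker service create \
      --name mongo-backup \
      --network backend \
      --mount type=volume,source=mongo-backups,target=/backups \
      mongo:4 \
      sh -c 'while true; do
               stamp=$(date +%Y%m%d-%H%M%S)
               mongodump --host mongo --out "/backups/$stamp" \
                 && tar -czf "/backups/$stamp.tar.gz" -C "/backups/$stamp" . \
                 && rm -rf "/backups/$stamp"
               sleep 3600
             done'

The official mongo image doesn't ship the aws CLI, so the upload step would either run as a cron job on the node that owns the volume, or inside your own image that layers awscli on top of mongo.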

Bret Fisher