
I am trying to sync a few large buckets on Amazon S3.

When I run my s3cmd sync --recursive command, I get a response that just says "killed".

What does this refer to? Is there a limit on the number of files that can be synced in S3?

starball

2 Answers


After reading around, it looks like the program has memory-consumption issues. In particular, this can cause the OOM killer (out-of-memory killer) to take down the process in order to keep the system from getting bogged down. A quick look at dmesg after the process is killed will generally show whether or not this is the case.
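If the OOM killer is responsible, the kernel log will usually contain a line naming the killed process. A quick check (the exact wording of the message varies by kernel version) might look like:

    dmesg | grep -i 'out of memory'

On an OOM kill you would expect to see something along the lines of "Out of memory: Killed process 1234 (s3cmd)".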

With that in mind, I would make sure you're on the latest release, whose release notes mention that memory-consumption issues have been fixed.
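For example, assuming pip is available on your system (s3cmd is published on PyPI; installing from the project's source tree also works), upgrading might look like:

    pip install --upgrade s3cmd
    s3cmd --version

The second command just confirms which version you ended up with.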

cwgem
  • Right on the button. However, after installing s3cmd from the latest source, the process is still getting killed. I might have to bump up the memory on my EC2 instance or take one of these steps: http://stackoverflow.com/a/15266865/242426 – plainjimbo May 21 '13 at 20:41
  • I was running into this as well, but I was running in a Vagrant VM. With the tip that it was a memory issue, I checked my Vagrant settings and found that I was running the VM with only 256 MB. Once I bumped that up, the problem was fixed. – SunSparc May 05 '14 at 16:50
  • Same here; increasing the memory (in my case from 512 MB to 2 GB) resolved the issue. – Pavel Jun 22 '14 at 03:40

Old question, but I would like to say that before you add more physical memory or increase the VM's memory, try just adding more swap.

I did this on 4 servers (Ubuntu and CentOS) with low RAM (700 MB total, only 15 MB available), and it is working fine now.
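A minimal sketch of adding swap on a typical Linux box (the path /swapfile and the 1 GB size are just examples; adjust to your machine):

    # create and enable a 1 GB swap file
    sudo dd if=/dev/zero of=/swapfile bs=1M count=1024
    sudo chmod 600 /swapfile
    sudo mkswap /swapfile
    sudo swapon /swapfile
    # verify the new swap is active
    free -m

Note that swapon only lasts until reboot; add an entry to /etc/fstab if you want the swap file to be permanent.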

Edgar