
I have an S3 bucket mounted as a volume on an EC2 instance with S3FS. Using PHP, I've created a directory structure there that is now more than 20 GB.

At the moment S3FS is exhausting the instance's memory and uploading is really slow, so I want to move all the files to an EBS volume attached to the same instance.

I've tried s3cmd, but there are some incompatibilities, since S3FS creates zero-sized objects in the bucket with the same names as directories.

I also tried writing a script to recursively copy the structure while skipping those zero-sized objects.
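For reference, that skip-zero-size idea can be sketched as a filter over an `s3cmd ls --recursive` listing (columns: date, time, size, URI). This is a minimal sketch with a hypothetical bucket name and a hard-coded sample listing, not the actual script from the question:

```shell
# A sample `s3cmd ls --recursive s3://your-bucket` listing. S3FS's directory
# markers show up as zero-sized objects (third column) named like directories.
listing='2012-06-01 12:00         0  s3://your-bucket/images
2012-06-01 12:01      1024  s3://your-bucket/images/a.jpg
2012-06-01 12:02     20480  s3://your-bucket/images/b.jpg'

# Keep only the URIs of objects with a non-zero size, i.e. the real files.
echo "$listing" | awk '$3 > 0 { print $4 }'
# → s3://your-bucket/images/a.jpg
# → s3://your-bucket/images/b.jpg

# In a real run, each surviving URI would then be fetched, e.g. with
#   s3cmd get "$uri" "/mnt/ebs/${uri#s3://your-bucket/}"
# after recreating the key's directory with mkdir -p.
```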

Neither worked.

Has anyone tried to do this? Thanks in advance for your help.

hernangarcia

1 Answer


@hernangarcia Don't make things complicated: use recursive wget, i.e. `wget -r` followed by the URL of the bucket endpoint. You can download all the contents to an EBS volume. I'd also suggest not storing those ~20 GB of files on the instance's root volume; instead, attach another volume to the instance and store all those files there. If you use a high-IOPS volume for it, operations will be faster.
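A rough sketch of that suggestion, assuming the bucket's objects are publicly readable (plain `wget` is unauthenticated), the bucket name `your-bucket` is a placeholder, and the EBS volume is already formatted and mounted at a hypothetical path `/mnt/ebs`:

```shell
# Download the bucket's contents over HTTP into the mounted EBS volume.
# Note: this requires the objects to be reachable/listable via plain HTTP;
# private buckets would need an authenticated tool such as s3cmd instead.
cd /mnt/ebs
wget --recursive --no-parent --no-host-directories \
     http://your-bucket.s3.amazonaws.com/
```

`--no-host-directories` keeps wget from prefixing every path with the endpoint hostname, so the downloaded tree mirrors the bucket's key structure.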

Jeevan Dongre