We are running GitLab in our Kubernetes cluster, using the rook-ceph Rados-Gateway as the S3 storage backend. We want to use the backup-utility shipped in GitLab's toolbox (tools) container. As the backup target, we configured an external MinIO instance. When running the backup-utility, the following error messages occur:
Bucket not found: gitlab-registry-bucket. Skipping backup of registry ...
Bucket not found: gitlab-uploads-bucket. Skipping backup of uploads ...
Bucket not found: gitlab-artifacts-bucket. Skipping backup of artifacts ...
Bucket not found: gitlab-lfs-bucket. Skipping backup of lfs ...
Bucket not found: gitlab-packages-bucket. Skipping backup of packages ...
Bucket not found: gitlab-mr-diffs. Skipping backup of external_diffs ...
Bucket not found: gitlab-terraform-state. Skipping backup of terraform_state ...
Bucket not found: gitlab-pages-bucket. Skipping backup of pages ...
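For context, I start the backup from the toolbox pod roughly like this (the deployment name is just an example from our release):

kubectl exec -it deploy/gitlab-toolbox -- backup-utility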
When I execute s3cmd ls, I only see the two backup buckets on our MinIO instance, not the "source" buckets.
Can someone tell me how to configure the backup-utility or s3cmd so it can access both the Rados-Gateway for the source buckets and MinIO as the backup target?
I have tried to define multiple connections in the .s3cfg file like this:
[target]
host_base = file01.xxx.xxx:80
host_bucket = file01.xxx.xxx:80
use_https = false
bucket_location = us-east-1
access_key = xxx
secret_key = xxx
[source]
host_base = s3.xxx.xxx:80
host_bucket = s3.xxx.xxx:80
use_https = false
bucket_location = us-east-1
access_key = xxx
secret_key = xxx
but that did not show any buckets from the target when running s3cmd ls.
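As a rough idea (untested), I assume I could at least keep two separate config files and switch between them with s3cmd's -c option, for example:

# hypothetical file names, one config per endpoint
s3cmd -c ~/.s3cfg-source ls   # list the source buckets on the Rados-Gateway
s3cmd -c ~/.s3cfg-target ls   # list the backup buckets on MinIO

but I don't see how to make the backup-utility itself use two different endpoints, so any hint is appreciated.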