I'm trying to get a GitLab CI/CD deploy pipeline to automatically copy a GitLab repo to a public S3 bucket.

I've set up masked environment variables in GitLab with the AWS access key and secret key, and I have a .gitlab-ci.yml that uses a Python image to install the AWS CLI. I then use `aws configure` to set up the CLI and kick off the deploy job.

The error I get is: upload failed: .git/logs/HEAD to s3://wildrides-mt42/.git/logs/HEAD An error occurred (AuthorizationHeaderMalformed) when calling the PutObject operation: The authorization header is malformed; a non-empty Access Key (AKID) must be provided in the credential.

The .gitlab-ci.yml is below:

variables:
  S3_BUCKET_NAME: "wildrides-mt42"
deploy:
  image: python:latest
  script:
  - pip install awscli
  - aws configure set aws_access_key_id "$AWS_ACCESS_KEY_ID"
  - aws configure set aws_secret_access_key "$AWS_SECRET_ACCESS_KEY"
  - aws configure set default_region_name "eu-west-2"
  - aws configure set default_output_format "json"
  - aws configure set aws_profile "default"
  - aws configure set AWS_DEFAULT_PROFILE "default"
  - aws s3 cp ./ s3://$S3_BUCKET_NAME/ --recursive

1 Answer

SOLVED - The problem was that the variable names defined in the GitLab project settings did not match the case of the variables referenced in the script. Making the names match exactly, including case (e.g. AWS_ACCESS_KEY_ID in both places), fixed the error.
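For reference, a minimal corrected job might look like the sketch below (bucket name and region taken from the question). Two side notes: the valid `aws configure set` keys for region and output are `region` and `output`, not `default_region_name`/`default_output_format`, and excluding `.git/*` keeps repository internals like the `.git/logs/HEAD` file from the error message out of a public bucket:

```yaml
variables:
  S3_BUCKET_NAME: "wildrides-mt42"

deploy:
  image: python:latest
  script:
    # These variable names must match the GitLab CI/CD variables exactly, including case
    - pip install awscli
    - aws configure set aws_access_key_id "$AWS_ACCESS_KEY_ID"
    - aws configure set aws_secret_access_key "$AWS_SECRET_ACCESS_KEY"
    - aws configure set region "eu-west-2"
    - aws configure set output "json"
    # Exclude the .git directory so repo internals aren't uploaded publicly
    - aws s3 cp ./ "s3://$S3_BUCKET_NAME/" --recursive --exclude ".git/*"
```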
