
I am using an AWS CodePipeline that fetches the source from my Git repository and uses a CodeBuild buildspec to build and save output artifacts to an S3 bucket, which ultimately get deployed to Elastic Beanstalk (Node.js environment).

Everything works fine, but I need the pipeline to copy one particular file from one of my AWS S3 buckets and add it to the output artifacts before deploying them to EB.

Can this be done using the buildspec?

artifacts:
  files:
    - '**/*'
    # - How to add a file from S3 to the artifacts?
saibbyweb

1 Answer


My recommendation is to copy the required file from S3 into your build directory as part of the build or post_build phase.

build:
  commands:
    - echo "Build commands"
    - aws s3 cp --region=xx-xxxx-x "s3://file/in/s3" "local-file-instance-to-include"

The file copied from S3 will then be available in your build directory, and you can include it in the artifacts output.
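
For reference, a minimal buildspec sketch putting the pieces together (the region, bucket path, and local destination filename are placeholders you would replace with your own):

version: 0.2
phases:
  build:
    commands:
      - echo "Build commands"
      # placeholder bucket/key and destination; substitute your own values
      - aws s3 cp --region=xx-xxxx-x "s3://your-bucket/path/to/file" "file-from-s3"
artifacts:
  files:
    - '**/*'

Since '**/*' already matches everything in the build directory, the copied file is picked up automatically; you could also list it explicitly under files if you prefer.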

KiteCoder
  • Tried adding the above command and now I am getting this error: `COMMAND_EXECUTION_ERROR: Error while executing command: aws s3 cp ...... Reason: exit status 1` – saibbyweb Jan 23 '21 at 09:00
  • The issue was with the permissions. The CB role didn't have permission to access S3 buckets. Thanks. – saibbyweb Jan 23 '21 at 12:32
  • Worked perfectly, but since I am loading a private key, I don't want to leave the bucket open to the public, do you know how to apply policies in this approach? – nacho Jun 18 '21 at 13:44
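
Regarding the permissions issue raised in the comments: rather than opening the bucket to the public, you can grant read access to the CodeBuild service role. A minimal sketch of a policy statement that could be attached to that role (bucket name and key are placeholders):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::your-bucket/path/to/file"
    }
  ]
}

s3:GetObject is what `aws s3 cp` needs to download a single object; scope the Resource to the specific key (or prefix) so the rest of the bucket stays private.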