image: python:3.7

pipelines:
  default:
    - step:
        name: Install and Configure Google Cloud SDK
        script:
          - export BUILD_TAG=stage-xxx:$BITBUCKET_BUILD_NUMBER
          - echo $GCLOUD_API_KEYFILE | base64 -d > ${HOME}/gcloud-service-key.json
          - apt-get update && apt-get install -y curl
          - curl https://sdk.cloud.google.com | bash -s -- --disable-prompts --install-dir=/usr/local/gcloud
          - export PATH=$PATH:/usr/local/gcloud/google-cloud-sdk/bin
          - gcloud auth activate-service-account --key-file=${HOME}/gcloud-service-key.json
          - gcloud config set project $PROJECT_ID
          - gcloud config list
          - gcloud components install kubectl
          - gcloud components install gsutil

    - step:
        name: Build Docker image and push to GCR
        services:
          - docker
        script:
          - export PATH=$PATH:/usr/local/gcloud/google-cloud-sdk/bin
          - export BUILD_TAG=stage-kredily:$BITBUCKET_BUILD_NUMBER
          - gcloud container clusters get-credentials gke-xxx-xxx-as1-service-01 --zone asia-south1-a --project prj-srv-xxx-xxx-01
          - gsutil -m rm -r gs://bkt-xxx-static-files-01/static/*
          - gsutil -m rsync -r static/ gs://bkt-xxxx-static-files-01/static/
          - docker build -t asia-south1-docker.pkg.dev/prj-srv-xxxx-beta-01/repo-xxxx-stage-as1-01/$BUILD_TAG .
          - gcloud auth configure-docker asia-south1-docker.pkg.dev
          - docker push asia-south1-docker.pkg.dev/prj-srv-xxxx-beta-01/repo-xxxx-stage-as1-01/$BUILD_TAG

    - step:
        name: Deploy to QA App Server
        image: gcr.io/google.com/cloudsdktool/google-cloud-cli:alpine
        script:
          - export PATH=$PATH:/usr/local/gcloud/google-cloud-sdk/bin
          - kubectl set image deployment/xx-app xx-app=asia-south1-docker.pkg.dev/prj-srv-xx-beta-01/repo-xx-stage-as1-01/stage-xx:latest --record --namespace=xx-beta


Here you can see that I install the Google Cloud components (mainly gsutil and kubectl) in the first step of the default pipeline, since they need to be used in the subsequent steps. However, I always get the error below:

bash: gcloud: command not found

1 Answer

From the docs: https://support.atlassian.com/bitbucket-cloud/docs/step-options/#The-Step-property

> Each step in your pipeline will start a separate Docker container to run the commands configured in the script option.

So, it's not that pipelines are "unable to propagate commands installed on default", it's that they won't BY DESIGN.

You should not split a step's setup out into a different step. Instead, cache whatever makes sense to cache so that subsequent executions are faster: https://support.atlassian.com/bitbucket-cloud/docs/cache-dependencies/
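As a rough sketch of that caching approach (this is an assumption on my part, not tested against your project: the custom cache name `gcloud-sdk` is hypothetical, and the install path `/usr/local/gcloud` is taken from your script), each step would restore the cached SDK directory and skip the download when it is already present:

```yaml
# Hypothetical sketch: cache the SDK install directory between pipeline runs.
# Each step must still add the SDK to PATH itself, because steps run in
# separate containers.
definitions:
  caches:
    gcloud-sdk: /usr/local/gcloud

pipelines:
  default:
    - step:
        name: Build with cached Cloud SDK
        caches:
          - gcloud-sdk
        script:
          # Install only if the cache was empty (first run or after eviction)
          - test -d /usr/local/gcloud/google-cloud-sdk || (curl https://sdk.cloud.google.com | bash -s -- --disable-prompts --install-dir=/usr/local/gcloud)
          - export PATH=$PATH:/usr/local/gcloud/google-cloud-sdk/bin
          - gcloud --version
```

Note that even with the cache, every step that needs `gcloud` has to repeat the `test`/`export PATH` lines; the cache only saves the download time.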

Also, it makes much more sense to simply use a Docker image built for the main tooling you are using, e.g. gcr.io/google.com/cloudsdktool/google-cloud-cli instead of installing gcloud in a python image!
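For instance, your deploy step could look something like this (a sketch assembled from the names already in your pipeline; note the `:alpine` variant of the image is minimal, so kubectl may need `gcloud components install kubectl` there, or use the non-alpine tag which bundles it):

```yaml
# Hypothetical sketch: run the step on the Cloud SDK image so gcloud and
# gsutil are preinstalled. Each step is a fresh container, so auth and
# cluster credentials must be set up inside the step itself.
- step:
    name: Deploy to QA App Server
    image: gcr.io/google.com/cloudsdktool/google-cloud-cli
    script:
      - echo $GCLOUD_API_KEYFILE | base64 -d > ${HOME}/gcloud-service-key.json
      - gcloud auth activate-service-account --key-file=${HOME}/gcloud-service-key.json
      - gcloud container clusters get-credentials gke-xxx-xxx-as1-service-01 --zone asia-south1-a --project prj-srv-xxx-xxx-01
      - kubectl set image deployment/xx-app xx-app=asia-south1-docker.pkg.dev/prj-srv-xx-beta-01/repo-xx-stage-as1-01/stage-xx:$BITBUCKET_BUILD_NUMBER --namespace=xx-beta
```

No `export PATH` or SDK install is needed here because the image ships with the tooling already on the PATH.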

N1ngu