
I'm looking to run a gcloud command as part of one of my jobs. Of course, when I initially ran the job I got a 'command not found' error. The CircleCI docs suggest using the gcp-cli orb to install and initialise the gcloud CLI.

My workflow looks like this:

workflows:
  build and deploy:
    jobs:
      - gcp-cli/install_and_initialize_cli:
          gcloud-service-key: insert_key_as_env_variable
          google-project-id: my_project_id
          google-compute-zone: my_compute_zone
      - build_job:
          requires:
            - gcp-cli/install_and_initialize_cli

The gcp-cli/install_and_initialize_cli job works perfectly well, but when build_job runs it says gcloud: command not found. I assumed that running the gcp-cli orb would make the gcloud CLI available to all downstream jobs.

Is there a way to make the gcloud-cli available to downstream jobs? I have tried to persist/attach workspaces but with no success (doesn't mean this isn't a possible solution). The other possible solution is to find a way to run the gcp-cli orb as part of my build_job, but I can't quite figure out how to do that either.

For reference my (very stripped down) build_job:

build_job:
    docker:
      - image: circleci/node
    steps:
      - run: gcloud auth configure-docker # FAILS HERE - moved to top on purpose

      - checkout

      - restore_cache:
          keys:
            - v1-dependencies-{{ checksum "package.json" }}
            # fallback to using the latest cache if no exact match is found
            - v1-dependencies-

      - run:
          name: Install Docker Compose
          command: |
            curl -L https://github.com/docker/compose/releases/download/1.19.0/docker-compose-`uname -s`-`uname -m` > ~/docker-compose
            chmod +x ~/docker-compose
            sudo mv ~/docker-compose /usr/local/bin/docker-compose

      - setup_remote_docker

....
Grant

2 Answers

I just had to work through this myself for a hobby project. I think the gcp-cli/install_and_initialize_cli job listed in the orb examples is a red herring; as you noticed, it runs in its own executor, and its environment disappears when the workflow moves on to the next job.

An easy way to work around this problem is to run the gcp-cli/install command as a step in your build_job. Once you've got the gcloud CLI installed you can run the appropriate auth and deploy commands. In your case you'd do something like:

build_job:
  docker:
    - image: circleci/node
  steps:
    - gcp-cli/install
    # do gcloud stuff here
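For completeness, here's a rough sketch of how the whole config might fit together. The orb version is an assumption (check the orb registry for the latest), and gcp-cli/initialize is the orb command that, as I understand it, picks up the service key and project settings from environment variables:

```yaml
version: 2.1

orbs:
  # version pin is an assumption -- check the CircleCI orb registry
  gcp-cli: circleci/gcp-cli@1.8.4

jobs:
  build_job:
    docker:
      - image: circleci/node
    steps:
      - checkout
      # install and authenticate gcloud inside this job's own executor
      - gcp-cli/install
      - gcp-cli/initialize   # reads GCLOUD_SERVICE_KEY etc. from env vars
      - run: gcloud auth configure-docker
```

Because the install now happens inside build_job's executor, there's nothing to persist between jobs.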

Hope that helps!

Jama22

Assuming you are using circleci/node as a base image in all jobs, and assuming you want gcloud to be available in all of them, I would do the following:

Create a new pipeline to build a Docker image, so that when you push to it in version control, it builds the image and pushes the image to a Docker registry. Use FROM circleci/node at the start of the Dockerfile and then install gcloud as you would normally in Linux. Tag it with the URL of your registry, e.g. like registry.gitlab.com/grant-isdale/gcloud-node, so that you can push and pull it (substituting your username and registry name as appropriate, of course).

Then, in your CircleCI jobs, use registry.gitlab.com/grant-isdale/gcloud-node instead of circleci/node. You can add authentication details here if the registry requires authentication.
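As a rough sketch of such a Dockerfile (the install method shown is one common approach via Google's apt repository, and the circleci user switch assumes the base image's default user; adapt as needed):

```dockerfile
# Base on the same image the CI jobs already use
FROM circleci/node

# Install the gcloud CLI from Google's apt repository
USER root
RUN apt-get update && \
    apt-get install -y apt-transport-https ca-certificates gnupg curl && \
    echo "deb [signed-by=/usr/share/keyrings/cloud.google.gpg] https://packages.cloud.google.com/apt cloud-sdk main" \
      > /etc/apt/sources.list.d/google-cloud-sdk.list && \
    curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | \
      gpg --dearmor -o /usr/share/keyrings/cloud.google.gpg && \
    apt-get update && apt-get install -y google-cloud-sdk

# Drop back to the image's unprivileged user
USER circleci
```

You'd then build, tag, and push it to your registry (e.g. docker build -t registry.gitlab.com/grant-isdale/gcloud-node . followed by docker push) so every job can pull it.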

halfer