
Here is my cloudbuild.yaml file:

steps:

# Build the trainer image
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/$_TRAINER_IMAGE_NAME:$TAG_NAME', '.']
  dir: $_PIPELINE_FOLDER/trainer_image
  
# Build the base image for lightweight components
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/$_BASE_IMAGE_NAME:$TAG_NAME', '.']
  dir: $_PIPELINE_FOLDER/base_image

# Compile the pipeline
- name: 'gcr.io/$PROJECT_ID/kfp-cli'
  args:
  - '-c'
  - |
    dsl-compile --py $_PIPELINE_DSL --output $_PIPELINE_PACKAGE
  env:
  - 'BASE_IMAGE=gcr.io/$PROJECT_ID/$_BASE_IMAGE_NAME:$TAG_NAME'
  - 'TRAINER_IMAGE=gcr.io/$PROJECT_ID/$_TRAINER_IMAGE_NAME:$TAG_NAME'
  - 'RUNTIME_VERSION=$_RUNTIME_VERSION'
  - 'PYTHON_VERSION=$_PYTHON_VERSION'
  - 'COMPONENT_URL_SEARCH_PREFIX=$_COMPONENT_URL_SEARCH_PREFIX'
  - 'USE_KFP_SA=$_USE_KFP_SA'
  dir: $_PIPELINE_FOLDER/pipeline
  
# Upload the pipeline
- name: 'gcr.io/$PROJECT_ID/kfp-cli'
  args:
  - '-c'
  - |
    kfp --endpoint $_ENDPOINT pipeline upload -p ${_PIPELINE_NAME}_$TAG_NAME $_PIPELINE_PACKAGE
  dir: $_PIPELINE_FOLDER/pipeline


# Push the images to Container Registry 
images: ['gcr.io/$PROJECT_ID/$_TRAINER_IMAGE_NAME:$TAG_NAME', 'gcr.io/$PROJECT_ID/$_BASE_IMAGE_NAME:$TAG_NAME']

So basically, what I am trying to do is write a bash script which contains all the variables, and any time I push changes the Cloud Build should be triggered automatically.
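
For reference, this is roughly the kind of trigger setup I have in mind; everything below (repo details, branch pattern, substitution values) is just a placeholder for my real settings, and depending on the gcloud version the command may still need the beta component:

#!/bin/bash
# Placeholder values - replace with the real project/repo settings.
PROJECT_ID=my-project
REPO_NAME=my-repo
REPO_OWNER=my-github-user

# Create a trigger that fires on every push and passes the user-defined
# substitutions consumed by cloudbuild.yaml.
gcloud builds triggers create github \
  --project=$PROJECT_ID \
  --repo-name=$REPO_NAME \
  --repo-owner=$REPO_OWNER \
  --branch-pattern='^master$' \
  --build-config=cloudbuild.yaml \
  --substitutions=_TRAINER_IMAGE_NAME=trainer_image,\
_BASE_IMAGE_NAME=base_image,\
_PIPELINE_FOLDER=pipelines,\
_PIPELINE_DSL=pipeline.py,\
_PIPELINE_PACKAGE=pipeline.yaml,\
_PIPELINE_NAME=my_pipeline,\
_ENDPOINT=https://my-kfp-host.example.com,\
_RUNTIME_VERSION=1.15,\
_PYTHON_VERSION=3.7,\
_COMPONENT_URL_SEARCH_PREFIX=https://raw.githubusercontent.com/kubeflow/pipelines/0.2.5/components/gcp/,\
_USE_KFP_SA=False

From what I understand, the built-in $TAG_NAME substitution is only populated for builds started from a tag push, so with a branch trigger like this I might have to use --tag-pattern instead (or replace $TAG_NAME with $SHORT_SHA in the YAML).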


2 Answers


You may add another step in the pipeline for the bash script, something like below:

steps:
  - name: 'bash'
    args: ['echo', 'I am bash']
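
If the goal is to run your own script from the repository, the step can point at it instead; the script path below is just an example. Keep in mind that every Cloud Build step runs in its own container, so variables exported by that script are not visible to later steps: anything you need afterwards has to be written to a file under the shared /workspace directory (or passed in as trigger substitutions).

# Example only: run a script checked into the repo (path is hypothetical)
steps:
  - name: 'bash'
    args: ['bash', './scripts/set_variables.sh']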
Prabhu

You seem to split the compilation and the upload between two separate steps. I'm not 100% sure this works in Cloud Build (I'm not sure the files are shared between the steps).

You might want to combine those steps.
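
Roughly like this, reusing the kfp-cli image and the environment variables from your file (untested, and it assumes that image runs its args through a shell, as your original steps do):

# Compile and upload the pipeline in a single step, so the compiled
# package is guaranteed to exist when the upload command runs.
- name: 'gcr.io/$PROJECT_ID/kfp-cli'
  args:
  - '-c'
  - |
    dsl-compile --py $_PIPELINE_DSL --output $_PIPELINE_PACKAGE
    kfp --endpoint $_ENDPOINT pipeline upload -p ${_PIPELINE_NAME}_$TAG_NAME $_PIPELINE_PACKAGE
  env:
  - 'BASE_IMAGE=gcr.io/$PROJECT_ID/$_BASE_IMAGE_NAME:$TAG_NAME'
  - 'TRAINER_IMAGE=gcr.io/$PROJECT_ID/$_TRAINER_IMAGE_NAME:$TAG_NAME'
  - 'RUNTIME_VERSION=$_RUNTIME_VERSION'
  - 'PYTHON_VERSION=$_PYTHON_VERSION'
  - 'COMPONENT_URL_SEARCH_PREFIX=$_COMPONENT_URL_SEARCH_PREFIX'
  - 'USE_KFP_SA=$_USE_KFP_SA'
  dir: $_PIPELINE_FOLDER/pipeline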

Ark-kun