
I have a standalone Cloud Source Repository (not cloned from GitHub) that I am using to automate the deployment of ETL pipelines. I am following the Google-recommended guidelines, i.e. committing each ETL pipeline as a .py file. The Cloud Build trigger associated with the Cloud Source Repository runs the steps defined in the cloudbuild.yaml file and puts the resulting .py file in the Composer DAG bucket. Composer then picks up the DAG and runs it.
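For context, here is a minimal sketch of the kind of cloudbuild.yaml I mean; the DAG path and the Composer bucket name are placeholders, not my real project names:

```yaml
# Sketch of a cloudbuild.yaml that copies a committed DAG file into the
# Composer DAG bucket. "dags/my_etl_pipeline.py" and the bucket name are
# placeholders.
steps:
  - name: 'gcr.io/cloud-builders/gsutil'
    args: ['cp', 'dags/my_etl_pipeline.py', 'gs://my-composer-dag-bucket/dags/']
```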

Now my question is: how do I orchestrate CI/CD across dev and prod? I did not find any proper documentation for this, so for now I am following a manual approach: if my code passes in dev, I commit the same code to the prod repo. Is there a better way to do this?

Kuwali
  • Can you please check if the [following link](https://codelabs.developers.google.com/codelabs/cloud-builder-gke-continuous-deploy#0) and the following [documentation](https://cloud.google.com/docs/ci-cd) help you? – Rajeev Tirumalasetty Aug 27 '21 at 12:56

1 Answer


Cloud Build triggers allow you to conditionally execute a cloudbuild.yaml file in various ways. Have you tried setting up a trigger that fires only on changes to a dev branch?
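As a sketch, a branch-scoped trigger could look like the spec below (project, repository, and bucket names are placeholders); a file like this can be imported with `gcloud builds triggers import --source=trigger-dev.yaml`, or the equivalent trigger can be created in the console:

```yaml
# Sketch of a Cloud Build trigger spec that fires only on pushes to the
# dev branch of a Cloud Source Repository. All names are placeholders.
name: deploy-dags-dev
description: Deploy DAGs to the dev Composer environment on pushes to dev
triggerTemplate:
  projectId: my-dev-project
  repoName: my-etl-repo
  branchName: ^dev$          # regex; matches only the dev branch
filename: cloudbuild.yaml
substitutions:
  _COMPOSER_DAG_BUCKET: my-dev-composer-dag-bucket
```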

Further, you can add substitutions to your trigger and use them in the cloudbuild.yaml file to, for example, name the generated artifacts based on some aspect of the input event.

See: https://cloud.google.com/build/docs/configuring-builds/substitute-variable-values and https://cloud.google.com/build/docs/configuring-builds/use-bash-and-bindings-in-substitutions
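For example, a sketch of a cloudbuild.yaml that reads a user-defined substitution so the same build config can target either environment; `_COMPOSER_DAG_BUCKET` is an assumed variable name that each trigger would supply:

```yaml
# Sketch: the same cloudbuild.yaml is reused by the dev and prod triggers;
# each trigger supplies its own _COMPOSER_DAG_BUCKET value (placeholder name).
steps:
  - name: 'gcr.io/cloud-builders/gsutil'
    args: ['cp', 'dags/*.py', 'gs://${_COMPOSER_DAG_BUCKET}/dags/']
substitutions:
  _COMPOSER_DAG_BUCKET: my-dev-composer-dag-bucket   # default, overridden per trigger
```

Built-in substitutions such as `$BRANCH_NAME` are also available if you prefer to key behavior off the branch directly.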

Charles Reich
  • Since our code is in a Cloud Source Repository only, we have no branch apart from master. There is no way to create a dev branch, or am I wrong here? – Kuwali Aug 31 '21 at 07:34