
We are using a mono repository (GCP Source Repository) that contains one folder per Cloud Function, each holding that function's source code. We are planning to implement a CI/CD pipeline using Cloud Build, and I would like to know whether it can be set up in such a manner that if I change and commit one particular function's source code, only that function gets deployed.

  • We have about 50 different functions (which makes 50 folders inside our main repo).
  • Each folder consists of a requirements.txt, the required .json files and the main.py (see the sketch after this list).
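
For reference, the layout I am describing looks roughly like this (the folder names here are just examples, not our real names):

  .
  ├── function-one/
  │   ├── main.py
  │   ├── requirements.txt
  │   └── config.json
  ├── function-two/
  │   ├── main.py
  │   ├── requirements.txt
  │   └── data.json
  └── ... (about 50 such folders)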

I am a newbie to this and am implementing CI/CD for the first time, so pardon me if I am not able to explain my problem properly; any suggestion would be of great help.

Thanks.

UPDATE: Using the .yaml file suggested below, my build step is now completing, but no function is getting created, and I am getting an error in the build logs, which I am posting below.

My YAML file:

steps:
  - name: 'gcr.io/cloud-builders/gcloud'
    entrypoint: 'bash'
    args:
      - -c
      - |
        for d in $(git diff --name-only --diff-filter=AMDR @~..@ | cut -d'/' -f 1);
        do
          echo $d;
          cd $d
          gcloud functions deploy $d --region=us-central1 --runtime=python37 --trigger-http
          cd ..
        done

Fail log (even though the step is shown as success / green):


FETCHSOURCE
Initialized empty Git repository in /workspace/.git/
From https://source.developers.google.com/p/xyz/r/testRep
 * branch            2f78b61ea0cc45efc3e25570fe4a08707 -> FETCH_HEAD
HEAD is now at 2fb61 testing
BUILD
Already have image (with digest): gcr.io/cloud-builders/gcloud
fatal: ambiguous argument '@~..@': unknown revision or path not in the working tree.
Use '--' to separate paths from revisions, like this:
'git <command> [<revision>...] -- [<file>...]'
PUSH
DONE
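
My guess about this error (an assumption on my part, not something confirmed anywhere in this thread) is that Cloud Build fetches only the single triggering commit into /workspace, so the parent commit that @~ refers to is simply not present and the diff fails. One possible workaround would be to deepen the fetch before running the diff, for example by adding a line like this at the top of the bash step (the exact arguments may need adjusting for your trigger setup):

  # hypothetical extra line, placed before the for-loop: pull in the parent
  # commit so that the @~..@ range can be resolved
  git fetch --deepen=1 origin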

1 Answer


I don't know if your deployment is the same for each function; I will take as a hypothesis that it is, and that the name of each function is the same as its directory name.

So, the idea is to extract the directories that have changed, loop over them, and then perform the deployment for each one:

steps:
  - name: 'gcr.io/cloud-builders/gcloud'
    entrypoint: 'bash'
    args:
      - -c
      - |
        for d in $(git diff --name-only --diff-filter=AMDR @~..@ | cut -d'/' -f 1);
        do
          echo $d;
          cd $d
          gcloud functions deploy $d --region=.... --runtime=.... --trigger-http
          cd ..
        done

guillaume blaquiere
  • This seems to be working; the only error I am facing is that even after using --allow-unauthenticated, it is still deploying my function as a private function. If I simply use cmd and run the gcloud functions deploy... command there, it deploys it perfectly the way I want (public). What could be the issue? – Akshay Pant Jan 28 '21 at 16:40
  • Oh, yes, I didn't put all the parameters that you can use on your Cloud Function. Maybe a service account, an allow-unauthenticated param, maybe a Cloud SQL or VPC connector, ... Update the deploy command according to your requirements! – guillaume blaquiere Jan 28 '21 at 19:20
  • I get it, and I read about all the parameters in the documentation, but it seems the documentation is not clear enough. According to it, using --allow-unauthenticated should deploy a new function as a public one, but it doesn't do that. So where am I going wrong? Or is it that when I deploy from cmd it uses one account, and when deploying via Cloud Build CI/CD it uses a different one? Could you please guide me to the right path? – Akshay Pant Jan 29 '21 at 05:17
  • Be sure to be on the right project. To check the applied allow-unauthenticated value, you can perform a `gcloud functions get-iam-policy` and you should see `allUsers` granted the role `roles/cloudfunctions.invoker` (see the sketch after these comments). – guillaume blaquiere Jan 29 '21 at 08:00
  • Thanks mate, though I figured it out: it was due to Cloud Build not having the Cloud Functions Admin role; it had Cloud Functions Developer, which doesn't give Cloud Build the setIamPolicy permission. – Akshay Pant Jan 29 '21 at 09:48
  • Thanks though, you were really helpful; would love to connect with you over LinkedIn. – Akshay Pant Jan 29 '21 at 09:51
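
For completeness, here is a minimal sketch of the checks discussed in the comments above (the function name, project ID and PROJECT_NUMBER are placeholders, not values from this thread):

  # check whether the function is publicly invokable: allUsers should appear
  # under roles/cloudfunctions.invoker
  gcloud functions get-iam-policy my-function --region=us-central1

  # grant the Cloud Build service account the Cloud Functions Admin role so it
  # can call setIamPolicy when deploying with --allow-unauthenticated
  gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
    --role="roles/cloudfunctions.admin"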