I'm pretty new to YAML files and DevOps, so sorry if this is a silly question. I have a Node project where one folder is just for Cloud Functions (GCP), and I'm creating the infrastructure in GCP with Terraform. For Cloud Functions, GCP lets you supply the code through Cloud Storage or Google Cloud Source Repositories. The team doesn't want to use another repo, so my only option is to take the Cloud Function code from Cloud Storage. Someone on the team mentioned I could modify the Cloud Build YAML file to zip the folder and upload it to Cloud Storage. I tried a few solutions but it doesn't work. Do you have an idea if this is possible, or another way to do it?
This is my current YAML file:
steps:
  - name: 'zip'
    args: ['send-emails.zip', '/cloud-functions/send-emails/index.js /cloud-functions/send-emails/package.json']
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: gcloud
    args: ['storage', 'cp', 'send-emails.zip', 'gs://bucket']
  # Build the container image
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', '${_DOCKER_IMAGE_URI}', '.']
  # Push the container image to Container Registry
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', '${_DOCKER_IMAGE_URI}']
  # Deploy the container image to Cloud Run
  # The Cloud Run service should already have been created, with its own env vars/secrets
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: gcloud
    args:
      [
        'run',
        'deploy',
        '${_CLOUD_RUN_SERVICE_NAME}',
        '--image',
        '${_DOCKER_IMAGE_URI}',
        '--region',
        '${_CLOUD_RUN_REGION}',
      ]
images:
  - '${_DOCKER_IMAGE_URI}'
timeout: 1200s
substitutions:
  _DOCKER_IMAGE_URI: ''          # default value
  _CLOUD_RUN_SERVICE_NAME: ''    # default value
  _CLOUD_RUN_REGION: ''          # default value
options:
  logging: CLOUD_LOGGING_ONLY
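
The step I can't get to work is the first one (the zip step). One idea I had was to reuse the cloud-sdk image and create the archive with Python's zipfile module, since that image already ships with Python, so I wouldn't need a separate zip tool. Something roughly like this instead of the 'zip' step; this is just a sketch, the paths are relative to the Cloud Build /workspace and I haven't confirmed it actually works:

  # Create the zip from the function sources in the build workspace
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: bash
    args:
      - '-c'
      - |
        cd cloud-functions/send-emails
        python3 -m zipfile -c /workspace/send-emails.zip index.js package.json

I think the existing 'gcloud storage cp' step would then find send-emails.zip in /workspace and upload it, but I don't know if this is the right approach or if there's a cleaner way.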
Thanks.