
I use the following command on my Compute Engine instance to run a script that's stored in Cloud Storage:

gsutil cat gs://project/folder/script.sh | sh

I want to create a function that runs this command and eventually schedule that function to run, but I don't know how to do this. Does anyone know how?

Frank van Puffelen
  • I'd be tempted to create a Docker image that includes the GCP SDK and hence gives you gsutil. You can then write a Node/Python/Java app that shells out and runs the command, and define that image to be run by Cloud Run. The end result would be "when you securely call an endpoint, your script runs". That has the same semantics as Cloud Functions, but you get control over the full environment. I don't think we can assume that gsutil is present in Cloud Functions. – Kolban Mar 22 '21 at 14:21 (a Dockerfile sketch of this idea follows these comments)
  • Currently, it is not possible to run shell commands inside a Google Cloud Function. – John Hanley Mar 22 '21 at 14:56
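
As a hedged illustration of the container approach from Kolban's comment, here is a minimal Dockerfile sketch. The google/cloud-sdk base image (which already ships gcloud and gsutil), the main.py filename, and the choice of Flask/gunicorn are assumptions, not something stated in the question:

# Debian-based image that already ships the Cloud SDK (gcloud, gsutil).
FROM google/cloud-sdk:latest

# Install pip and the small web framework used by the wrapper app (main.py).
# On newer Debian bases you may need a virtualenv or pip's
# --break-system-packages flag instead of a system-wide install.
RUN apt-get update && apt-get install -y python3-pip && \
    pip3 install flask gunicorn

WORKDIR /app
COPY main.py .

# Cloud Run sends requests to the port given in $PORT (8080 by default).
CMD exec gunicorn --bind :$PORT --workers 1 --threads 4 main:app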

1 Answer


Cloud Functions is serverless and you can't manage the runtime environment. You don't know what is installed in the Cloud Functions runtime, and you can't assume that gcloud or gsutil exists there.

The solution is to use Cloud Run. The behavior is very close to Cloud Functions: simply wrap your function in a web server (I wrote my first article on that) and, in your container, install whatever you want, especially the gcloud SDK (you can also use a base image with the gcloud SDK already installed). This time you will be able to call system binaries, because you know they exist: you installed them!
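
To make "wrap your function in a web server" concrete, here is a minimal sketch of the main.py referenced in the Dockerfile above. It uses Flask (my choice, not the author's) and simply pipes the script through sh, exactly like the original command; the SCRIPT_URI default is the placeholder path from the question:

import os
import subprocess

from flask import Flask

app = Flask(__name__)

# Location of the script in Cloud Storage; the default is the placeholder
# path from the question and would normally be overridden via an env var.
SCRIPT_URI = os.environ.get("SCRIPT_URI", "gs://project/folder/script.sh")


@app.route("/", methods=["POST"])
def run_script():
    # Same pipeline as on the VM: gsutil cat gs://... | sh
    # gsutil is available because it was installed in the container image.
    result = subprocess.run(
        f"gsutil cat {SCRIPT_URI} | sh",
        shell=True,
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        # Surface stderr so the caller (e.g. Cloud Scheduler) sees the failure.
        return result.stderr, 500
    return result.stdout, 200


if __name__ == "__main__":
    # Local debugging only; in the container, gunicorn serves the app.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))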

Anyway, be careful with what your script does: the container image is immutable and anything you write at runtime is only in memory and disappears when the instance stops, so you can't durably change files, binaries, or stored data. I don't know the content of your script, but you aren't on a VM; you are still in a serverless environment with an ephemeral runtime.
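
The question also asked about scheduling, which the answer above does not cover. A common pattern (this part is my assumption, with placeholder project, region, URL and service account) is to deploy the container to Cloud Run and let Cloud Scheduler call its URL on a cron schedule:

# Build and deploy the container; names and region are placeholders.
gcloud builds submit --tag gcr.io/PROJECT_ID/run-script
gcloud run deploy run-script \
  --image gcr.io/PROJECT_ID/run-script \
  --region us-central1 \
  --no-allow-unauthenticated

# Call the (private) service every hour; the OIDC token lets the
# service account authenticate against Cloud Run.
gcloud scheduler jobs create http run-script-job \
  --schedule "0 * * * *" \
  --uri "https://run-script-xxxxxxxx-uc.a.run.app/" \
  --http-method POST \
  --oidc-service-account-email scheduler-invoker@PROJECT_ID.iam.gserviceaccount.com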

guillaume blaquiere