
As the design specification makes clear, the source directory of a Google Cloud Function is expected to include main.py, requirements.txt, and __init__.py. Additional local dependencies (i.e., code) may be included so long as their imports resolve within the source directory, as described here. This precludes importing from sibling or parent directories.

In the directory layout below, main.py can import internal.code_internal and, if base/ has been added to PYTHONPATH, can also import base.code_sibling (see the sketch after the layout). The limitations (by design) of Cloud Functions do not allow this latter import, as only the contents of functions/f/ are deployed to its servers. My question concerns workarounds, in particular the use of symbolic links.

base/
   __init__.py
   code_sibling.py
functions/
   f/
      __init__.py
      main.py
      requirements.txt
      internal/
         __init__.py
         code_internal.py
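
To make the distinction concrete, here is a minimal sketch of what main.py might attempt in this layout (module names follow the tree above):

    # functions/f/main.py -- minimal sketch
    from internal import code_internal   # fine: internal/ lives inside the source directory
    import base.code_sibling             # works locally once base/ is on PYTHONPATH,
                                         # but fails once deployed: base/ is never uploaded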

In conventional Python, a symbolic link can be added inside functions/f/ pointing to base/, which makes the contents of base/ importable as though they lived directly in the functions/f/ directory: in other words, as though the file functions/f/base/code_sibling.py existed. However, this does not change the deployment behavior of Cloud Functions: the symbolic link is seemingly ignored by gcloud functions deploy. Instead, I find myself copying the base/ directory into functions/f/, deploying the Cloud Function, and then deleting the copied files of functions/f/base/.
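
For concreteness, here is a sketch of both behaviors, run from within functions/f/. The function name, entry point, and runtime are placeholders; --source, --trigger-http, --entry-point, and --runtime are standard gcloud functions deploy flags:

    # Local development: the symlink makes base/ importable.
    ln -s ../../base base
    python -c "import base.code_sibling"   # resolves through the symlink

    # Deployment: gcloud ignores the symlink, so copy, deploy, clean up.
    rm base                                # remove the symlink itself
    cp -r ../../base base
    gcloud functions deploy my-function --runtime python311 \
        --trigger-http --entry-point main --source .
    rm -rf base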

Has anyone been able to get symbolic links to work, or are there other workarounds that better address the situation? Thank you.

Cross-posted to functions-framework-python with this ticket.

David Bernat
  • I think I would be tempted to package `base` up and include the wheel in a dist folder under f (see the sketch below). – JonSG Aug 08 '23 at 14:38
  • Not the solution, in this case. Tempted, yes. But packaging always requires hosting on external repositories, or a lot more overhead for incremental changes. – David Bernat Aug 08 '23 at 14:49
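
For reference, a minimal sketch of the wheel approach floated above, assuming base/ carries its own pyproject.toml; the version in the wheel filename is hypothetical, and whether the Cloud Functions builder resolves relative paths in requirements.txt is an assumption here:

    # Build the wheel into a dist/ folder under the function's source directory.
    python -m build --wheel --outdir functions/f/dist base/

    # functions/f/requirements.txt would then reference it by relative path:
    #   ./dist/base-0.1.0-py3-none-any.whl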

1 Answer


Cloud Functions runs on a different file system from your local one; it does not support symbolic links or certain other file operations because its file system is read-only. One option is to work around this by copying the files you need directly into the Cloud Function source directory.

Alternatively, you can use the gcloud functions deploy command with the --runtime_file flag. This flag lets you specify a file containing configuration details; in that file, you can set pythonPath to include the directory you want to import from.

For example:

gcloud functions deploy my-function --runtime python37 --runtime_file runtime.yaml

In the runtime.yaml file, you would have:

pythonPath:
  - functions/f/base

This tells Cloud Functions to search for imports in the functions/f/base directory. Which solution to choose depends on how many files you need to import and how often they change.

For further reference:

An object in Google Cloud Storage which acts as a "redirect" or "symlink"

gcloud functions runtime list

Julia
  • Thank you for bringing to my attention the `--runtime_file` flag. That does appear to be the designed solution for this problem. One caveat: your discussion of the Cloud Function using a different filesystem is not accurate in the context you describe. The entirety of my discussion concerns the local filesystem, before its export to the Cloud (different) filesystem; the deploy command simply does not respect symlinks when searching for files, whereas native Python does. The `--runtime_file` flag appears to be the engineered, intentional solution; copying (which I do) works as a kludge. – David Bernat Aug 14 '23 at 15:23