As the design specification makes clear, the source directory of a Google Cloud Function is expected to include main.py, requirements.txt, and __init__.py. Additional local dependencies (i.e., code) may be included so long as their imports resolve within the source directory, as described here. This precludes importing from sibling or parent directories.
In the directory setup below, main.py can import internal.code_internal and, if the parent of base/ has been added to the PYTHONPATH, can import base.code_sibling. The limitations (by design) of Cloud Functions do not allow this latter import, as only the contents of functions/f/ are deployed to its servers. My question concerns workarounds, in particular the use of symbolic links.
base/
    __init__.py
    code_sibling.py
functions/
    f/
        __init__.py
        main.py
        requirements.txt
        internal/
            __init__.py
            code_internal.py
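The import behavior described above can be demonstrated locally. The following sketch rebuilds the layout in a temporary directory; the VALUE constants are illustrative and not part of the actual project:

```python
import sys
import tempfile
from pathlib import Path

# Rebuild the layout above in a temporary directory.
root = Path(tempfile.mkdtemp())
(root / "base").mkdir()
(root / "base" / "__init__.py").write_text("")
(root / "base" / "code_sibling.py").write_text("VALUE = 'sibling'\n")
f_dir = root / "functions" / "f"
(f_dir / "internal").mkdir(parents=True)
(f_dir / "__init__.py").write_text("")
(f_dir / "internal" / "__init__.py").write_text("")
(f_dir / "internal" / "code_internal.py").write_text("VALUE = 'internal'\n")

# The Functions runtime puts the source directory itself on sys.path,
# so internal/ is importable:
sys.path.insert(0, str(f_dir))
from internal import code_internal
print(code_internal.VALUE)          # internal

# Locally, adding the repository root also makes base/ importable --
# but that path does not exist on the deployed servers.
sys.path.insert(0, str(root))
from base import code_sibling
print(code_sibling.VALUE)           # sibling
```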
In conventional Python, a symbolic link pointing to base/ can be added inside functions/f/, which makes the contents of base/ importable as though they lived directly in the functions/f/ directory: in other words, as though the file functions/f/base/code_sibling.py existed. However, this does not change the deployment behavior of Cloud Functions: the symbolic link is seemingly ignored by gcloud functions deploy. Instead, I find myself copying the base/ directory into functions/f/, deploying the Cloud Function, and then deleting the copied files in functions/f/base/.
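The manual copy/deploy/delete cycle can at least be scripted so that cleanup always happens. A minimal sketch follows; the helper name deploy_with_sibling and the deploy flags in the usage comment are assumptions for illustration, not an official API:

```python
import shutil
import subprocess
from pathlib import Path

def deploy_with_sibling(source_dir, sibling_dir, deploy_cmd):
    """Copy sibling_dir into source_dir, run deploy_cmd, then clean up.

    Mirrors the manual copy -> deploy -> delete cycle; the try/finally
    guarantees the copied files are removed even if the deploy fails.
    """
    staged = Path(source_dir) / Path(sibling_dir).name
    shutil.copytree(sibling_dir, staged)   # stage base/ inside functions/f/
    try:
        subprocess.run(deploy_cmd, check=True)
    finally:
        shutil.rmtree(staged)              # always delete the copy

# Hypothetical usage with the layout above (deploy flags are illustrative):
# deploy_with_sibling(
#     "functions/f", "base",
#     ["gcloud", "functions", "deploy", "f",
#      "--source", "functions/f", "--runtime", "python311", "--trigger-http"],
# )
```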
Has anyone been able to get symbolic links to work, or are there other workarounds that better address this situation? Thank you.
Cross-posted to functions-framework-python with this ticket.