Imagine a basic Lambda function like this:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""This is an example Lambda Function which imports MyFancyClass from the
backend.
"""
from typing import Any, Dict

from project_name.module import MyFancyClass


def lambda_handler(event: Dict[str, Any], context: Any) -> Dict[str, Any]:
    """Standard Lambda event handler."""
    del context
    class_instance = MyFancyClass('some initial value')
    result = class_instance.special_method(event)
    return result
I could not find a clear explanation of how this setup works in AWS: how do you provide a backend code repository from which the Lambda function can import Python modules?
I read about Lambda Layers, but deploying all dependencies as a ZIP file every time I change something in the backend does not seem like the smooth process I envision.
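For what it's worth, here is a minimal sketch of what I understand that ZIP-and-publish step to be, scripted with boto3; the layer name, runtime, and the assumption that the backend package sits in a local project_name/ directory are all placeholders on my part:

"""Sketch: package the backend repo as a Lambda layer and publish it."""
import io
import zipfile
from pathlib import Path

import boto3


def build_layer_zip(package_dir: str) -> bytes:
    """Zip the package under the python/ prefix that Lambda layers expect."""
    buffer = io.BytesIO()
    root = Path(package_dir)
    with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in root.rglob("*"):
            if path.is_file():
                # At runtime Lambda puts /opt/python on sys.path, so the
                # package must live under a top-level python/ directory.
                arcname = (Path("python") / root.name / path.relative_to(root)).as_posix()
                zf.write(path, arcname)
    return buffer.getvalue()


if __name__ == "__main__":
    client = boto3.client("lambda")
    response = client.publish_layer_version(
        LayerName="backend-code",  # placeholder layer name
        Content={"ZipFile": build_layer_zip("project_name")},
        CompatibleRuntimes=["python3.12"],
    )
    print("Published:", response["LayerVersionArn"])

Running this by hand after every backend change is exactly the friction I would like to avoid.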
I would much rather believe there is a way to set up a CI/CD pipeline on AWS where the code repository is managed in AWS CodeCommit. With that, you push changes to the remote backend repo, the pipeline redeploys, and the next time the Lambda function is executed it accesses the up-to-date backend code.
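To make concrete what I imagine such a pipeline doing, here is a rough sketch of a deploy step that, say, a CodeBuild stage could run after each push; the ZIP path and function name are placeholders I made up:

"""Sketch of a deploy step a build job could run after each push."""
import boto3


def deploy(zip_path: str, function_name: str) -> None:
    with open(zip_path, "rb") as f:
        package = f.read()
    client = boto3.client("lambda")
    # Replaces the function's deployment package in place; the next
    # invocation runs the updated code.
    client.update_function_code(FunctionName=function_name, ZipFile=package)


if __name__ == "__main__":
    deploy("build/lambda.zip", "my-fancy-function")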
But maybe zipping the backend code and deploying it as a Lambda layer really is the only way. If so, I would like to know how this can be combined with a CI/CD pipeline and a repository in the most convenient way.
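If layers are indeed the answer, I assume the pipeline step following publish_layer_version would attach the new version to the function, roughly like this (function name again a placeholder; as I understand it, Layers replaces the function's entire layer list):

"""Sketch: point the function at a freshly published layer version."""
import boto3

client = boto3.client("lambda")


def attach_layer(function_name: str, layer_version_arn: str) -> None:
    # Layers is a full replacement list, so every layer version the
    # function should keep must be included, not just the new one.
    client.update_function_configuration(
        FunctionName=function_name,
        Layers=[layer_version_arn],
    )


# E.g. with the ARN returned by the publish step above:
# attach_layer("my-fancy-function", response["LayerVersionArn"])

Is this the intended workflow, or is there a smoother way to wire the repo, the pipeline, and the Lambda function together?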