
I'm currently using SAM for some of my serverless applications. I also have "simpler" lambda functions that don't require an API Gateway or a more complicated integration with AWS services.

Some of these functions include S3 and DynamoDB trigger functions, or functions that just run periodically. I describe them as simple because they are small and don't use many services.

Because of this, I don't like the idea of using SAM for them. I would like to be able to test my functions locally still in a similar way that SAM allows. Ideally, I'd like to group all of my simple lambda functions into one git repository, then I can pull down the repo, edit, test, and deploy the new version of a function. This seems complicated with SAM. I want to avoid having many large repos for each function, which I think would end up being the case with SAM on these. I don't mind having a separate repo for larger projects, it's these small ones that are giving me a hard time.

Is there a good way to manage these small lambda functions? Or do I need to just accept using SAM for them?

Alex DeCamillo
  • Hey Alex - good question that I'd like to see the answer to as well; I've got similar questions/thoughts as you. One thing I've tried (not sure if it would work for you) is to bundle sets of similar, small Lambdas together into a single source file, with different Lambda handlers defined. That way groups of Lambdas point to the same source file but use different handlers (functions) in that file. For some of my use cases, that makes managing groups of separate small Lambdas easier than a separate source file for each one – Richard Jan 07 '20 at 20:14
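To illustrate the approach from the comment above, here is a minimal sketch of one source file containing several Lambda entry points. The file and function names are hypothetical; each function would be wired up as a separate Lambda's handler (e.g. `handlers.on_s3_upload`):

```python
# handlers.py - one source file shared by several small Lambdas.
# Each Lambda is configured with a different handler in this file,
# e.g. "handlers.on_s3_upload" or "handlers.on_schedule".

def on_s3_upload(event, context):
    """Entry point for a Lambda triggered by S3 ObjectCreated events."""
    # S3 delivers the affected objects in event["Records"]
    for record in event.get("Records", []):
        key = record["s3"]["object"]["key"]
        print(f"Processing upload: {key}")
    return {"status": "ok"}

def on_schedule(event, context):
    """Entry point for a Lambda run on a schedule."""
    print("Running periodic task")
    return {"status": "ok"}
```

Each function keeps the standard `(event, context)` Lambda signature, so any of them can be pointed at by a function definition without code changes.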

2 Answers


AWS SAM is perfect for that use case! Compared to plain CloudFormation, it lets you write Lambda functions and their triggers in a much more compact way. I'm not sure what you consider complicated, but have a look at the following example. This is a single AWS SAM template which defines two AWS Lambda functions: one triggered by S3 events, the other executed on a schedule every five minutes:

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Globals:
  Function:
    Runtime: python3.8
Resources:
  Bucket:
    Type: AWS::S3::Bucket
  ProcessorFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: s3_processor.handler
      CodeUri: src/
      Policies: AmazonS3ReadOnlyAccess
      Events:
        PhotoUpload:
          Type: S3
          Properties:
            Bucket: !Ref Bucket
            Events: s3:ObjectCreated:*
  ScheduledFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: scheduled_function.handler
      CodeUri: src/
      Events:
        Timer:
          Type: Schedule
          Properties:
            Schedule: rate(5 minutes)

I won't show how the same definition would look in pure CloudFormation, as it would be more than a hundred lines.

You can deploy both functions using a single CloudFormation stack, which seems to be what you want to do.

As for the code, you can see I used two different handlers, so you'd simply need an s3_processor.py and a scheduled_function.py in your src/ directory and you're good to go. You can easily keep all of that in a single git repository and enjoy the benefits of AWS SAM's local testing abilities.
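For completeness, a minimal sketch of what such a handler file could look like (the body is illustrative; the only thing the template actually requires is a module named `s3_processor` exposing a function named `handler`):

```python
# s3_processor.py - matches "Handler: s3_processor.handler" in the template.
# A minimal sketch; real code would do something useful with each object.

def handler(event, context):
    # The S3 trigger delivers the created objects in event["Records"]
    keys = [r["s3"]["object"]["key"] for r in event.get("Records", [])]
    print(f"New objects: {keys}")
    return keys
```

scheduled_function.py would follow the same pattern with its own `handler` function.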

If you want a separate CloudFormation stack per AWS Lambda function, you can simply create a sub-directory in your git repository for each function.

Dunedan

You should have a look at the Serverless Framework. Like you, I'm not a big fan of SAM templates; for me the Serverless Framework is a faster and easier way to work on scenarios like the one you describe.

https://serverless.com/

I'm using it on some of my projects; it's very simple and functional.
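To make the comparison concrete, here is a hypothetical serverless.yml roughly equivalent to the SAM template in the other answer (service and bucket names are made up; Serverless Framework configuration lives in this one file per service):

```yaml
# serverless.yml - illustrative equivalent of the SAM example above
service: small-functions

provider:
  name: aws
  runtime: python3.8

functions:
  s3Processor:
    handler: s3_processor.handler
    events:
      - s3:
          bucket: my-photo-bucket
          event: s3:ObjectCreated:*
  scheduled:
    handler: scheduled_function.handler
    events:
      - schedule: rate(5 minutes)
```

As with SAM, all of these small functions can live in a single git repository and be deployed together with one command (`serverless deploy`).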

Richard Lee