
I'm trying to launch a Lambda function for local development inside a Docker container using AWS SAM. The gotcha here is that I'm running on a MacBook with an M1 chipset, and I have a base container that must run on x86_64 chips (an Oracle thick client requirement).

I was able to get the container to build using colima, and I have a default colima VM running as below:

PROFILE    STATUS     ARCH      CPUS    MEMORY    DISK     RUNTIME    ADDRESS
default    Running    x86_64    4       12GiB     60GiB    docker

This looks good, and I can start up and run other containers as expected.
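For completeness, a VM with that profile can be created with something along these lines (the flags are colima's; the sizes simply mirror the profile shown above, so adjust as needed):

# start an x86_64 VM with 4 CPUs, 12GiB of memory and a 60GiB disk
colima start --arch x86_64 --cpu 4 --memory 12 --disk 60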

The problem I"m facing is when running a container through AWS SAM its failing with:

[ERROR] (rapid) Init failed error=Runtime exited with error: signal: illegal instruction (core dumped) InvokeID=

This looks to be related to not having enough memory.

I'm launching this using a simple template.yml configuration file:

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Globals:
  Function:
    Timeout: 360
    Tracing: Active

Resources:
  Function:
    Type: AWS::Serverless::Function
    Properties:
      PackageType: Image
      Architectures:
        - x86_64

    Metadata:
      Dockerfile: Dockerfile.local
      DockerContext: .

I'm launching this through the PyCharm AWS Toolkit plugin. The question is: how do I provide more memory to the container that is running the function?
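For anyone wanting to reproduce this outside of PyCharm, my understanding is that the toolkit is essentially driving the SAM CLI, so the rough equivalent should be (Function is the logical ID from the template above):

# build the image-based function defined in template.yml
sam build

# invoke it locally in a Docker container
sam local invoke Function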

I should note that this works flawlessly when running on a native x86_64 chip, and only uses about 1GB of memory.

I'm sure I've seen a Metadata.DockerArgs option (or something similarly named) at one point in the AWS documentation, but I can no longer find any reference to it.

Chris
  • Hello! Can you please try adding a new `MemorySize` property with value `1024` for your `AWS::Serverless::Function`? i.e. `MemorySize: 1024` under `Properties:`; the toolkit *should* pick up on that. – Ermiya Eskandary Dec 16 '22 at 00:29
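For reference, the change suggested in the comment above would look roughly like this in the template (MemorySize is in MB, and 1024 is just the commenter's suggested starting point):

Resources:
  Function:
    Type: AWS::Serverless::Function
    Properties:
      PackageType: Image
      MemorySize: 1024   # memory (in MB) for the function; per the comment, the toolkit should pick this up
      Architectures:
        - x86_64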

0 Answers