
I am reading an S3 object with AWS Lambda using the AWS Lambda C++ runtime. I use this function:

#include <aws/core/Aws.h>
#include <aws/core/platform/Environment.h>
#include <aws/s3/S3Client.h>
#include <aws/s3/model/GetObjectRequest.h>
#include <cstdio>
#include <fstream>
#include <string>

void s3read(const std::string& bucket, const std::string& key, const std::string& filename_local) {
    Aws::Client::ClientConfiguration client_conf;
    client_conf.region = Aws::Environment::GetEnv("AWS_REGION");
    client_conf.caFile = "/etc/pki/tls/certs/ca-bundle.crt";
    Aws::S3::S3Client s3_client(client_conf);

    Aws::S3::Model::GetObjectRequest object_request;
    object_request.WithBucket(bucket.c_str()).WithKey(key.c_str());

    auto get_object_outcome = s3_client.GetObject(object_request);
    if (get_object_outcome.IsSuccess()) {
        // Keep the result object alive while streaming from its body:
        // calling GetBody() directly on the temporary returned by
        // GetResultWithOwnership() would leave the reference dangling.
        auto result = get_object_outcome.GetResultWithOwnership();
        auto& retrieved_file = result.GetBody();
        Aws::OFStream local_file;
        // std::ios::in | std::ios::out fails if the file does not exist yet;
        // open for writing only.
        local_file.open(filename_local.c_str(), std::ios::out | std::ios::binary);
        local_file << retrieved_file.rdbuf(); // Leak
        local_file.close();
    }
    std::remove(filename_local.c_str()); // For leak testing purposes
}

It works fine. However, the Max Memory Used reported by Lambda keeps growing when I invoke the function repeatedly. The leak seems to come from rdbuf(), but I don't know how to fix it.

Update: There is no leak when filename_local is /dev/null.

Medical physicist

0 Answers