
I'm trying to stream a file from requests.get to S3 using smart-open, inside a Lambda. The Lambda role has what I believe is the proper access to S3: the "s3:GetObject" and "s3:PutObject" actions, plus access to the KMS key the bucket is encrypted with.
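
For context, the role's permissions look roughly like the policy below, written here as a Python dict; the resource ARNs are placeholders, not the real values.

# Rough shape of the Lambda role's inline policy (resource ARNs are placeholders)
lambda_role_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::my-bucket-xyz/*",
        },
        {
            "Effect": "Allow",
            "Action": ["kms:Decrypt"],
            "Resource": "arn:aws:kms:<region>:<account-id>:key/<key-id>",
        },
    ],
}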

I'm getting the error message: the bucket 'my-bucket-xyz' does not exist, or is forbidden for access (ClientError('An error occurred (AccessDenied) when calling the CreateMultipartUpload operation: Access Denied')).

My hunch is that something is wrong with the way I'm using smart-open, but I'm not sure; I only started using smart-open today.

My code:

import boto3
import requests
from smart_open import open  # use smart_open's open for the s3:// URL

def download_report(baseURL):
    bucket = "my-bucket-xyz"

    # Stream the report from the source URL
    response = requests.get(baseURL, stream=True)

    s3url = "s3://" + bucket + '/reports.txt'
    transport_params = {
        'client': boto3.client('s3'),
        'client_kwargs': {
            'S3.Client.create_multipart_upload': {'ServerSideEncryption': 'aws:kms'}
        },
    }
    # 'wb' because response.content is bytes
    with open(s3url, 'wb', transport_params=transport_params) as fout:
        fout.write(response.content)

    return
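
To isolate whether the failure comes from smart-open or from the role itself, a quick check would be to make the same call smart-open makes under the hood with a plain boto3 client and then abort it. This is only a sketch, reusing the bucket and key names from above:

import boto3

# Sketch: call CreateMultipartUpload directly, then clean up.
# If this also raises AccessDenied, the problem is the role/bucket/KMS policy,
# not the smart-open usage.
s3 = boto3.client('s3')
mpu = s3.create_multipart_upload(
    Bucket='my-bucket-xyz',
    Key='reports.txt',
    ServerSideEncryption='aws:kms',
)
s3.abort_multipart_upload(
    Bucket='my-bucket-xyz',
    Key='reports.txt',
    UploadId=mpu['UploadId'],
)
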
  • Please check that the S3 bucket and the Lambda are in the same region, and that the Lambda is not created in a VPC. – Sri Mar 02 '23 at 00:26
  • Both are in the same region, and the Lambda is not in a VPC. – Jack Mar 02 '23 at 00:41
  • The smart-open documentation page shows an example where a `session` object is created with access credentials, which is then used in `transport_params`. I would assume it can also extract credentials from the metadata generated by the IAM Role assigned to the AWS Lambda function, but it might be worth trying that method just to 'force' some known-good credentials. I presume you have configured the IAM Role for access to the S3 bucket? – John Rotenstein Mar 02 '23 at 02:22
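
A rough sketch of that suggestion, forcing explicit credentials through `transport_params` (the key values below are placeholders, and hard-coding credentials would only be for testing):

import boto3
from smart_open import open

# Sketch of the 'known-good credentials' test suggested in the comment above.
# The access key values are placeholders; never commit real credentials.
session = boto3.Session(
    aws_access_key_id='AKIA...',        # placeholder
    aws_secret_access_key='xxxxxxxx',   # placeholder
)
transport_params = {
    'client': session.client('s3'),
    'client_kwargs': {
        'S3.Client.create_multipart_upload': {'ServerSideEncryption': 'aws:kms'}
    },
}
with open('s3://my-bucket-xyz/reports.txt', 'wb', transport_params=transport_params) as fout:
    fout.write(b'test')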

0 Answers