
I have the following function to read data from a generator source(), which yields chunks of data (the length of each chunk is not fixed). The data needs to be written to an S3 object.

import boto3

async def write_s3(s3_key):
    session = boto3.Session()
    s3 = session.resource('s3')
    s3obj = s3.Object(BUCKET, s3_key)
    mpu = s3obj.initiate_multipart_upload()
    part_number = 1
    async for chunk in source():
        # How to write all the chunks yielded from source() to S3 (s3_key)?
        part = mpu.Part(part_number)
        part.upload(Body=chunk)
        part_number += 1

    part_info = { .... }
    mpu_result = mpu.complete(MultipartUpload=part_info)

How to write all the chunks yielded from source() to an S3 object? The size of each chunk may vary.
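One possible approach (a minimal sketch, not a definitive implementation): use the low-level client calls `create_multipart_upload` / `upload_part` / `complete_multipart_upload`, buffering chunks until each part reaches S3's minimum part size, since every part except the last must be at least 5 MiB. `BUCKET` and `source()` are assumed to exist as in the question; `batch_parts` and `MIN_PART_SIZE` are names introduced here for illustration.

```python
import asyncio

# Assumed from the question: BUCKET (bucket name) and source() (async generator).
# batch_parts and MIN_PART_SIZE are illustrative names introduced in this sketch.
MIN_PART_SIZE = 5 * 1024 * 1024  # S3 requires every part except the last to be >= 5 MiB


async def batch_parts(chunks, min_size=MIN_PART_SIZE):
    """Regroup variable-size chunks into parts of at least min_size bytes;
    only the final part may be smaller."""
    buf = b""
    async for chunk in chunks:
        buf += chunk
        if len(buf) >= min_size:
            yield buf
            buf = b""
    if buf:
        yield buf


async def write_s3(s3_key):
    import boto3  # deferred import: only needed when actually uploading

    s3 = boto3.client("s3")
    upload_id = s3.create_multipart_upload(Bucket=BUCKET, Key=s3_key)["UploadId"]
    parts = []
    try:
        part_number = 0
        async for body in batch_parts(source()):
            part_number += 1
            resp = s3.upload_part(
                Bucket=BUCKET, Key=s3_key, PartNumber=part_number,
                UploadId=upload_id, Body=body,
            )
            # complete_multipart_upload needs each part's number and ETag
            parts.append({"PartNumber": part_number, "ETag": resp["ETag"]})
        s3.complete_multipart_upload(
            Bucket=BUCKET, Key=s3_key, UploadId=upload_id,
            MultipartUpload={"Parts": parts},
        )
    except Exception:
        # Abort on failure so incomplete parts don't keep accruing storage charges.
        s3.abort_multipart_upload(Bucket=BUCKET, Key=s3_key, UploadId=upload_id)
        raise
```

Note that `upload_part` is a blocking call inside the async loop; in a fully async application you might wrap it with `run_in_executor` or switch to an async S3 client, but that is beyond this sketch.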

Ref: Stream large string to S3 using boto3
