
I have an Azure Function (Python) triggered by file uploads to Azure Blob Storage. The function works fine for files up to ~120 MB, but when I load tested it with a 2 GB file, it failed with the error "Stream was too long".

  • Where is this limitation documented?
  • How would I overcome it using Python?

I'm using the boto3 library to PUT files to AWS S3:

import base64
import hashlib
import logging
import pathlib

import azure.functions as func
import boto3

bucketName = "my-bucket"  # placeholder


def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob\n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")

    # Reads the entire blob into memory -- this is what fails on large files
    myblobBytes = myblob.read()

    fileName = pathlib.Path(myblob.name).name

    s3 = boto3.resource(
        's3',
        aws_access_key_id="youguessedit",
        aws_secret_access_key="noyoudidnt",
    )

    # ContentMD5 expects the base64-encoded MD5 digest of the body
    md5Checksum = base64.b64encode(hashlib.md5(myblobBytes).digest()).decode()

    obj = s3.Bucket(bucketName).put_object(Key="folder/" + fileName,
                                           Body=myblobBytes,
                                           ContentMD5=md5Checksum)

    obj.wait_until_exists()



1 Answer


I changed the boto3 call from put_object to upload_fileobj and set up a TransferConfig with multipart_threshold=1024*25, max_concurrency=10, multipart_chunksize=1024*25, and use_threads=True.
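Roughly what that looks like (a sketch, not my exact code; the bucket name is a placeholder and credentials are assumed to come from the environment rather than being hardcoded):

import pathlib

import azure.functions as func
import boto3
from boto3.s3.transfer import TransferConfig

# Multipart settings from above; boto3 will raise chunk sizes below
# S3's 5 MB minimum part size automatically.
transferConfig = TransferConfig(
    multipart_threshold=1024 * 25,
    max_concurrency=10,
    multipart_chunksize=1024 * 25,
    use_threads=True,
)


def main(myblob: func.InputStream):
    fileName = pathlib.Path(myblob.name).name

    # Credentials picked up from environment/IAM instead of hardcoding
    s3 = boto3.client('s3')

    # upload_fileobj streams the blob in multipart chunks, so the whole
    # file is never held in memory the way put_object's Body was.
    s3.upload_fileobj(myblob, "my-bucket", "folder/" + fileName,
                      Config=transferConfig)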

Rips now!

Able to transfer 2 GB in 89 seconds! Not bad.
