I have an Azure Function (Python) triggered by file uploads to Azure Blob Storage. The function works fine for files up to ~120 MB, but when I load tested it with a 2 GB file, it failed with the error `Stream was too long`.
- Where is this limitation documented?
- How would I overcome it using Python?
I'm using the boto3 library to PUT the files to AWS S3:
    import logging
    import pathlib

    import azure.functions as func
    import boto3

    def main(myblob: func.InputStream):
        logging.info(f"Python blob trigger function processed blob\n"
                     f"Name: {myblob.name}\n"
                     f"Blob Size: {myblob.length} bytes")

        # Reads the entire blob into memory -- this is where the 2 GB file fails
        myblobBytes = myblob.read()
        fileName = pathlib.Path(myblob.name).name

        s3 = boto3.resource(
            's3',
            aws_access_key_id="youguessedit",
            aws_secret_access_key="noyoudidnt",
        )

        # bucketName and md5Checksum (base64-encoded MD5 of the body) are defined elsewhere
        response = s3.Bucket(bucketName).put_object(Key="folder/" + fileName,
                                                    Body=myblobBytes,
                                                    ContentMD5=md5Checksum)
        response.wait_until_exists()
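
For context, the direction I've been experimenting with is to avoid `myblob.read()` entirely and hand the stream to boto3's `upload_fileobj`, which performs a multipart upload for large bodies instead of a single PUT. This is an untested sketch: it assumes `func.InputStream` behaves as a file-like object that boto3 can read from, and I'm aware that if the "Stream was too long" limit is on the blob-trigger side rather than in my code, this may not help. It also drops the `ContentMD5` check, since I don't believe that argument is supported for multipart uploads.

    import boto3
    from boto3.s3.transfer import TransferConfig

    def upload_stream(myblob, bucketName, fileName):
        # Sketch only: stream the blob to S3 in parts rather than buffering it all
        s3 = boto3.resource(
            's3',
            aws_access_key_id="youguessedit",
            aws_secret_access_key="noyoudidnt",
        )
        # Switch to multipart upload above 64 MB, sending 64 MB parts
        config = TransferConfig(multipart_threshold=64 * 1024 * 1024,
                                multipart_chunksize=64 * 1024 * 1024)
        s3.Bucket(bucketName).upload_fileobj(
            myblob,                  # assumes the InputStream is file-like
            "folder/" + fileName,
            Config=config,
        )

Is this the right approach, or does the limit sit somewhere I can't work around from Python?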