I want to copy multiple objects from one S3 bucket to another asynchronously, and I also want to track the status of each copy.
I have already tried using async/await with the boto3 S3 client, but the copies still run sequentially. I have also tried the aiobotocore and aioboto3 libraries, but they do not seem to expose a copy_object API that works the way it does in boto3.
import asyncio
import aioboto3

async def copy_file(taskdescriptor: list, buckets: list):
    print("taskdescriptor", taskdescriptor)
    for job in taskdescriptor[0]:
        print("Inside copy", job, buckets)
        src_bucket = buckets[0]
        dest_bucket = buckets[1]
        src_path = job[0]
        dest_path = job[1]
        src_loc = {"Bucket": src_bucket, "Key": src_path}
        async with aioboto3.client('s3') as asyncs3:
            print("s3 copy from {} to {}".format(src_path, dest_path))
            # each copy is awaited before the next one starts, so this still runs sequentially
            resp = await asyncs3.copy_object(Bucket=dest_bucket, Key=dest_path, CopySource=src_loc)
            print("done ----->", resp)
    print("file copy done")
Basically I want to copy multiple files in one shot; I am looking for the quickest way to copy data from one S3 bucket to another.
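
To make the goal concrete, here is a rough sketch of the kind of concurrent fan-out I am after, using aioboto3's Session API and asyncio.gather, with a plain dict standing in for status tracking. The function names (copy_one, copy_all), the status values and the example buckets/keys are just placeholders of mine, and I am not sure this is the right or quickest approach:

import asyncio
import aioboto3

async def copy_one(s3, src_bucket, dest_bucket, src_key, dest_key, status):
    # copy a single object and record its outcome in the shared status dict
    try:
        await s3.copy_object(
            Bucket=dest_bucket,
            Key=dest_key,
            CopySource={"Bucket": src_bucket, "Key": src_key},
        )
        status[dest_key] = "done"
    except Exception as exc:
        status[dest_key] = "failed: {}".format(exc)

async def copy_all(jobs, buckets):
    # jobs: list of (src_key, dest_key) pairs; buckets: [src_bucket, dest_bucket]
    src_bucket, dest_bucket = buckets
    status = {}
    session = aioboto3.Session()
    async with session.client("s3") as s3:
        # launch every copy at once and wait for all of them to finish
        await asyncio.gather(
            *(copy_one(s3, src_bucket, dest_bucket, src, dest, status)
              for src, dest in jobs)
        )
    return status

# example call (placeholder bucket and key names):
# statuses = asyncio.run(copy_all([("a.txt", "a.txt"), ("b.txt", "b.txt")],
#                                 ["src-bucket", "dest-bucket"]))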