
Using s3fs, I am uploading a file to an already created S3 bucket (I am not deleting the bucket). On execution, the following error is thrown:

[Operation Aborted]: A conflicting conditional operation is currently in progress against this resource.

However, I would just like to dump the pickle file into the existing bucket rather than creating a new bucket for every dump.

I could not find a helpful answer on this.
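
Roughly, this is the kind of write I am doing (bucket and file names are placeholders), going through fsspec's URL interface over s3fs:

import pickle
import fsspec

bucket_name = 'my-existing-bucket'  # placeholder: this bucket already exists
file = 'abc.pkl'

# Write the pickle through fsspec's URL-style open
with fsspec.open(f's3://{bucket_name}/{file}', 'wb') as f:
    pickle.dump('data_to_be_written', f)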


2 Answers


This means the bucket was queued for deletion.

You must wait until it is fully deleted before you can re-create it and upload.

AWS docs
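
If you do need to re-create a bucket, here is a minimal sketch of waiting for the deletion to finish, assuming boto3 is available; the bucket name is a placeholder:

import boto3

s3 = boto3.client('s3')
bucket_name = 'my-bucket'  # placeholder

# Block until the old bucket is fully gone, then re-create it
waiter = s3.get_waiter('bucket_not_exists')
waiter.wait(Bucket=bucket_name)
# Note: outside us-east-1, create_bucket also needs a CreateBucketConfiguration
s3.create_bucket(Bucket=bucket_name)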

  • If you read me right, the bucket is neither deleted nor queued for deletion; it has existed the whole time. – Roxy Jul 15 '21 at 16:36
  • Sorry, have you double-checked the public access permissions? – chrishollinworth Jul 15 '21 at 17:49
  • Yes, the permissions are in place. My tasks are stored successfully at first, but updating and saving them on subsequent attempts throws this error. – Roxy Jul 15 '21 at 18:45

This was due to fsspec's wrapper over s3fs, which had a conflicting mk_dir argument: it was trying to create the bucket even though the bucket already existed in AWS.

Instead, I removed fsspec and used the s3fs module directly:

import s3fs
import pickle

bucket_name = 'my-existing-bucket'  # placeholder: the bucket already exists
file = 'abc.pkl'

# Talk to S3 directly through s3fs, so no bucket-creation call is attempted
s3 = s3fs.S3FileSystem()
with s3.open(f's3://{bucket_name}/{file}', 'wb') as f:
    pickle.dump('data_to_be_written', f)
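
For completeness, a short sketch of reading the pickle back with the same s3fs handle (same placeholder names as above):

# Read the object back and unpickle it
with s3.open(f's3://{bucket_name}/{file}', 'rb') as f:
    data = pickle.load(f)
print(data)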