
I have an Azure Functions service running in a Kubernetes environment that reads and writes files in Azure Data Lake Storage Gen2. When a new file is written to blob storage, the trigger fires and the function reads the file, makes some changes, and writes it back to storage. When multiple files change at the same time, multiple pods are spawned in Kubernetes and access the files concurrently. Because the ETag has changed by then, I get this error from the Azure service:

The condition specified using HTTP conditional header(s) is not met.
ErrorCode:ConditionNotMet 

What is the best way to avoid this ETag mismatch between the read and write operations?

Below is my code. The read side uses the blob trigger binding:

import azure.functions as func

def main(blobdata: func.InputStream):
    # Fires when a new blob lands in the container; blobdata streams its content
    print(blobdata.name)

Write operation

from typing import Union
from azure.storage.filedatalake import DataLakeFileClient

def set_file_content(self, file_path: str, data: Union[str, bytes, bytearray]) -> None:
    file = DataLakeFileClient(
        account_url=self.account_url,
        credential=self.account_key,
        file_system_name=self.fs_name,
        file_path=file_path)
    file.upload_data(data, length=len(data), overwrite=True)
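From what I have read, the usual way to handle this is optimistic concurrency: capture the ETag when reading, write conditionally on that ETag, and re-read and retry if another pod wrote in between. Here is a runnable sketch of that loop — `FakeStore`, `ConditionNotMetError`, and `read_modify_write` are stand-in names I made up to simulate the storage; in my real code `read()` would map to downloading the file plus `get_file_properties().etag`, and `conditional_write()` to `upload_data` with the captured ETag:

```python
import random
import time

class ConditionNotMetError(Exception):
    """Raised when the stored ETag no longer matches the one we read."""

class FakeStore:
    """In-memory stand-in for a Data Lake file: every write bumps the ETag."""
    def __init__(self, data: bytes):
        self.data = data
        self.etag = 1

    def read(self):
        return self.data, self.etag

    def conditional_write(self, data: bytes, etag: int):
        if etag != self.etag:  # another pod wrote after we read
            raise ConditionNotMetError("ConditionNotMet")
        self.data = data
        self.etag += 1

def read_modify_write(store, transform, max_attempts: int = 5):
    """Optimistic-concurrency loop: re-read and retry when the ETag moved."""
    for attempt in range(max_attempts):
        data, etag = store.read()
        try:
            store.conditional_write(transform(data), etag)
            return
        except ConditionNotMetError:
            # jittered backoff before re-reading the fresh version
            time.sleep(random.uniform(0, 0.01 * (attempt + 1)))
    raise RuntimeError("gave up after repeated ETag conflicts")
```

Is this retry-on-conflict approach the recommended pattern here, or does the SDK offer something built in?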

I have seen an article about this ETag configuration:

Azure Blob Error: StorageException: The condition specified using HTTP conditional header(s) is not met

How do I set the ETag in the read and write operations?

I tried to set it in the write operation as below, but something is missing. I got an error saying the condition was not specified:

file.upload_data(data, length=len(data), overwrite=True, etag="*", match_condition="IfPresent")
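If I understand the SDK reference correctly (this is an assumption on my part), `match_condition` expects the `MatchConditions` enum from `azure.core` rather than a string, and `etag` should be the value captured when the file was read. This is the shape of the call I am now trying — `conditional_upload` and `file_client` are my own illustrative names, and the enum fallback is only there so the snippet runs without the Azure packages installed:

```python
try:
    from azure.core import MatchConditions  # real enum shipped in azure-core
except ImportError:  # stand-in with the same member so the sketch runs anywhere
    import enum

    class MatchConditions(enum.Enum):
        IfNotModified = 2

def conditional_upload(file_client, data: bytes, etag: str) -> None:
    """Upload only if the file's current ETag still equals `etag`.

    `file_client` is assumed to be a DataLakeFileClient; the call should fail
    with ConditionNotMet (HTTP 412) if another pod wrote the file first.
    """
    file_client.upload_data(
        data,
        length=len(data),
        overwrite=True,
        etag=etag,  # ETag captured at read time
        match_condition=MatchConditions.IfNotModified,
    )
```

Is passing the enum with the read-time ETag the correct way to make the write conditional?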
