I am using a Lambda function that is triggered when a new file is added to an S3 bucket. The function transforms the contents of the file and places it in another S3 bucket.
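For context, this is roughly the shape of my handler (heavily simplified; the bucket names, target key, and transform below are placeholders, not my exact code):

```python
import json
import urllib.parse
import boto3

s3 = boto3.client('s3')

def transform(data):
    # placeholder for the real transformation logic
    return data

def lambda_handler(event, context):
    # Triggered by an s3:ObjectCreated:* event on the source bucket
    record = event['Records'][0]
    source_bucket = record['s3']['bucket']['name']
    source_key = urllib.parse.unquote_plus(record['s3']['object']['key'])  # e.g. 'test/emp.json'

    # Read and transform the uploaded file
    obj = s3.get_object(Bucket=source_bucket, Key=source_key)
    transformed = transform(json.loads(obj['Body'].read()))

    # Write the result into the 'emp' folder of the target bucket
    s3.put_object(
        Bucket='targetbucket',
        Key='emp/emp.json',  # placeholder target key
        Body=json.dumps(transformed).encode('utf-8'),
    )
```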
When a new file 'emp.json' is added to the 'sourcebucket/test' folder, a new log stream is created in CloudWatch Logs and the file is added to the 'targetbucket/emp' folder. This works as expected.
When I add another file 'emp1.json' to the 'sourcebucket/test' folder within 5 minutes, the logs are appended to the same log stream and the existing file in the 'targetbucket/emp' folder gets replaced. Instead of appending to the existing log stream, can a new log stream be created?
When I add another file 'emp2.json' to the 'sourcebucket/test' folder after 5 minutes, a new log stream is created in CloudWatch Logs and the file is added to the 'targetbucket/emp' folder. This also works fine.
The problem occurs only when I add a new file to the same folder within less than 5 minutes: the new output overwrites or replaces the existing file. I am new to AWS Lambda, so please let me know if this can be fixed.
So, the question is: how can I stop the file in the 'targetbucket/emp' folder from being overwritten when uploads arrive within a few minutes of each other?
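One idea I have been considering (just an untested sketch, and the folder name is a placeholder) is to derive the target key from the source file name, so that each upload maps to its own object in the target folder instead of a single fixed key:

```python
import os
import urllib.parse

def target_key_for(source_key):
    # e.g. 'test/emp1.json' -> 'emp/emp1.json', so two uploads
    # never collide on the same target object
    file_name = os.path.basename(urllib.parse.unquote_plus(source_key))
    return f"emp/{file_name}"

# inside the handler, instead of writing to a fixed key:
# s3.put_object(Bucket='targetbucket', Key=target_key_for(source_key), Body=body)
```

Would that be the right approach, or is there a better way to handle this?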