
I'm following this Case Study, which is similar to what I want to do: receive thousands of files in an S3 bucket and launch a batch task that will consume them.

But I'm getting:

Problem occurred while synchronizing 'bucket' to local directory; nested exception is org.springframework.messaging.MessagingException: Failed to execute on session; nested exception is com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied;

I already consume this bucket using the spring-cloud-starter-aws dependency in some other apps.

I know the message is pretty clear, but do I need specific bucket permissions to sync like this with Spring Cloud Data Flow?
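
For example, would a minimal policy like the one below be enough? I'm assuming the sync only needs to list the bucket and download objects; the ARNs are only illustrative:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListTheBucket",
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::mybucket"
    },
    {
      "Sid": "ReadTheObjects",
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::mybucket/*"
    }
  ]
}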

My current Stream config is:

s3 
--spring.cloud.function.definition=s3Supplier,taskLaunchRequestFunction 
--file.consumer.mode=ref 
--s3.common.path-style-access=true 
--s3.supplier.remote-dir=mybucket 
--s3.supplier.local-dir=/scdf/infile 
--cloud.aws.credentials.accessKey=**** 
--cloud.aws.credentials.secretKey=**** 
--cloud.aws.region.static=**** 
--cloud.aws.stack.auto=false 
--task.launch.request.taskName=bill-composed-task 
| 
task-launcher-dataflow 
--spring.cloud.dataflow.client.server-uri=http://localhost:9393
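
In case it helps with debugging: a quick way to check the keys outside of SCDF (assuming the AWS CLI is configured with the same access key, secret key and region as the stream) would be something like:

aws s3 ls s3://mybucket
aws s3 cp s3://mybucket/some-key /tmp/some-key

(some-key is just a placeholder for any object in the bucket.) If these also return AccessDenied, the problem would be with the IAM user or bucket policy rather than with the stream configuration.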

Thanks in advance

Guilherme Bernardi
  • I don't know for sure, but try removing path-style-access. That maps to a setting in the AWS SDK that is required for MinIO but should not be required for AWS. I would also try removing stack.auto=false. – dturanski Jan 11 '21 at 15:00
  • Hi @dturanski, I tried removing path-style and also stack.auto, but no success. I decided to move to spring-batch-integration because of the short time I have to achieve my goal, but I'll try again because I liked SCDF a lot. – Guilherme Bernardi Jan 12 '21 at 23:27

0 Answers