I use Spring Cloud Data Flow deployed to Pivotal Cloud Foundry to run Spring Batch jobs as Spring Cloud Tasks, and the jobs require AWS credentials to access an S3 bucket.
I've tried passing the AWS credentials as task properties, but the credentials then show up in the task's log files as arguments or properties (https://docs.spring.io/spring-cloud-dataflow/docs/current/reference/htmlsingle/#spring-cloud-dataflow-global-properties).
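For reference, this is roughly what I'm doing today from the SCDF shell (the task name and property keys are placeholders for my actual setup), and it's these values that end up visible in the logs:

```
task launch s3-batch-task --properties "app.s3-batch-task.cloud.aws.credentials.accessKey=AKIA...,app.s3-batch-task.cloud.aws.credentials.secretKey=..."
```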
For now, I am manually setting the credentials as environment variables in PCF after each deployment, but I'm trying to automate this. The tasks aren't deployed until they are actually launched, so on each deployment I have to launch the task, wait for it to fail due to missing credentials, and then set the credentials as environment variables with the cf CLI (roughly the sequence sketched below). How do I provide these credentials without them showing up in the PCF app's logs?
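The manual workaround looks something like this (app and variable names are placeholders; the task app only exists in the space after that first failed launch):

```
# run after the first launch has created the task app in the space
cf set-env my-batch-task AWS_ACCESS_KEY_ID <access-key>
cf set-env my-batch-task AWS_SECRET_ACCESS_KEY <secret-key>
# then relaunch the task from SCDF so the new environment variables are picked up
```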
I've also explored using Vault and Spring Cloud Config, but again I would need to pass credentials to the task just to access Spring Cloud Config.
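For example, even a minimal config-client bootstrap would still need some secret supplied to the task (the URI and token below are placeholders), so it's the same problem in a different place:

```
# bootstrap.properties (placeholder values) - the task still needs a secret to reach Config Server / Vault
spring.cloud.config.uri=https://my-config-server.example.com
spring.cloud.config.token=<vault-token-or-other-secret>
```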
Thanks!