
We are trying to migrate from Spring XD to Spring Cloud Data Flow. We are able to schedule a batch job using Spring Cloud Data Flow, but one thing I've noticed is that, in addition to the job arguments we supply, the framework also logs framework-level parameters, such as:

--endpoints.jmx.unique-names=true
--endpoints.shutdown.enabled=true
--spring.datasource.username=username
--spring.datasource.url=jdbc:oracle:thin:@//ip:1521/sid
--spring.datasource.driverClassName=oracle.jdbc.driver.OracleDriver
--server.port=33333
--spring.cloud.task.name=hdp-wc-tsk
--spring.datasource.password=pass
--spring.jmx.default-domain=hdp-wc-tsk-d8412dda-fee2-4faf-9bba-f30d4f705fce
--spring.cloud.task.executionid=123

Is there any way to keep these framework-level parameters, which have nothing to do with the batch job itself, from being logged against the job in the job repository database?
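One direction I've considered (sketched below purely as an illustration; the class name `ArgumentFilter` and the prefix list are my own, not anything provided by SCDF or Spring Batch) is to filter out arguments with known framework prefixes before they are converted into job parameters, so only business arguments would be persisted:

```java
import java.util.Arrays;

// Hypothetical helper: drops arguments whose keys match known
// framework-level prefixes, keeping only business arguments.
// The prefix list below is an assumption based on the arguments
// observed in the job repository, not an official SCDF list.
public class ArgumentFilter {

    private static final String[] FRAMEWORK_PREFIXES = {
        "--endpoints.",
        "--spring.datasource.",
        "--server.port",
        "--spring.cloud.task.",
        "--spring.jmx."
    };

    public static String[] filter(String[] args) {
        return Arrays.stream(args)
                .filter(arg -> Arrays.stream(FRAMEWORK_PREFIXES)
                        .noneMatch(arg::startsWith))
                .toArray(String[]::new);
    }
}
```

Note that the framework properties themselves (datasource URL, task name, etc.) are still needed for the application to run, so any real fix would have to filter only what gets persisted as job parameters, not what reaches the application environment.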

ssm75
  • This is potentially happening in the handshake with SCDF, SCT and Spring Batch. What's supplied in the launch-arguments are blindly passed over and Spring Batch catches and persists them. Could you please add a story to SCDF's [backlog](https://github.com/spring-cloud/spring-cloud-dataflow/issues)? We will investigate it then. – Sabby Anandan Nov 17 '17 at 15:33
  • Thanks for the response, [#1793](https://github.com/spring-cloud/spring-cloud-dataflow/issues/1793) has been created – ssm75 Nov 17 '17 at 17:08

0 Answers