
Name: spring-cloud-dataflow-server Version: 2.5.0.BUILD-SNAPSHOT

I have created a very simple task. The first run always completes fine, with no issues. If the task is run again, it fails with the following error.

[Screenshots in original post: Task Graph, Task Definition, first run with no issues, first-run job log]

A subsequent launch of the same task fails with the exception below, even though it is a fresh run started after the previous execution completed fully. If a task is run once, can't it be run again? Failed subsequent runs (log from Task Execution Details, Execution ID: 246):

Caused by: org.springframework.batch.core.repository.JobInstanceAlreadyCompleteException: A job instance already exists and is complete for parameters={-spring.cloud.data.flow.taskappname=composed-task-runner, -spring.cloud.task.executionid=246, -graph=threetasks-t1 && threetasks-t2 && threetasks-t3, -spring.datasource.username=root, -spring.cloud.data.flow.platformname=default, -dataflow-server-uri=http://10.104.227.49:9393, -management.metrics.export.prometheus.enabled=true, -management.metrics.export.prometheus.rsocket.host=prometheus-proxy, -spring.datasource.url=jdbc:mysql://10.110.89.91:3306/mysql, -spring.datasource.driverClassName=org.mariadb.jdbc.Driver, -spring.datasource.password=manager, -management.metrics.export.prometheus.rsocket.port=7001, -management.metrics.export.prometheus.rsocket.enabled=true, -spring.cloud.task.name=threetasks}.  If you want to run this job again, change the parameters.
techpro

1 Answer


A Job Instance in a Spring Batch application requires unique Job Parameters; this is by design.

In this case, since you are using a composed task, you can pass the property --increment-instance-enabled=true as part of the composed task definition to handle this. This property ensures that each new Job Instance gets unique Job Parameters.

You can check the list of properties supported by the Composed Task Runner here.
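As a rough sketch, assuming the task definition from the question (threetasks, composed of threetasks-t1, t2, t3), creating and launching it from the SCDF shell with the incrementer flag might look like the session below. The exact property/argument syntax can vary by SCDF version, so verify it against the documentation for your release:

```shell
# Hypothetical SCDF shell session; task and app names are taken from the question.
dataflow:> task create threetasks --definition "threetasks-t1 && threetasks-t2 && threetasks-t3"
dataflow:> task launch threetasks --arguments "--increment-instance-enabled=true"
```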

Ilayaperumal Gopinathan
  • In case you want to handle the same for non-composed-task applications, you need to pass a random property as a command-line argument to the batch job, or configure a `JobParametersIncrementer` in your application. There is some context in this SCDF issue: https://github.com/spring-cloud/spring-cloud-dataflow/issues/1489 – Ilayaperumal Gopinathan May 02 '20 at 01:19
  • Thank you, Ilayaperumal. It worked as per your suggestion. – techpro May 03 '20 at 21:25
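To illustrate the `JobParametersIncrementer` idea mentioned in the comment above: Spring Batch ships a `RunIdIncrementer` that bumps a `run.id` parameter on every launch, so each run forms a new Job Instance. The following is a minimal, dependency-free sketch of that logic in plain Java (it is not the actual Spring Batch API, just an illustration of the increment behavior):

```java
import java.util.HashMap;
import java.util.Map;

// Dependency-free sketch of what Spring Batch's RunIdIncrementer does:
// derive the next launch's parameters from the previous launch's,
// bumping a "run.id" counter so the job repository sees a new Job Instance.
public class RunIdIncrementerSketch {

    static Map<String, Object> getNext(Map<String, Object> previous) {
        Map<String, Object> next = new HashMap<>(previous);
        Object value = previous.get("run.id");
        long runId = (value instanceof Long) ? (Long) value : 0L;
        next.put("run.id", runId + 1);   // changed parameter => new Job Instance
        return next;
    }

    public static void main(String[] args) {
        Map<String, Object> first = getNext(new HashMap<>());
        Map<String, Object> second = getNext(first);
        System.out.println(first.get("run.id") + " " + second.get("run.id"));
    }
}
```

Because `run.id` differs on every launch, the `JobInstanceAlreadyCompleteException` from the question cannot occur: the parameter set is never identical to a completed run's.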