I have a simple Spring Batch remote-partitioning project running with Spring Cloud Task, and I'm using Spring Cloud Data Flow to run the Spring Batch job. Spring Boot version - 2.7.8, Spring Cloud Task - 2.4.5, Spring Cloud Data Flow - 2.10.1-SNAPSHOT
I'm running Spring Cloud Data Flow locally with Docker Compose, and I'm using the Spring Cloud local deployer as well.
The Spring Batch job moves data from one table to another within the same database. I partition 100 records into 4 partitions, and I deliberately make one of the partitions fail.
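The partitioning is a simple contiguous range split over the record ids; a plain-Java sketch of that range math (the class and key names here are illustrative only, not my actual Spring Batch `Partitioner`):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class RangePartitioner {
    // Splits [1..totalRecords] into gridSize contiguous id ranges,
    // mirroring what a Spring Batch Partitioner would put into each
    // worker step's ExecutionContext (min/max keys are placeholders).
    public static Map<String, int[]> partition(int totalRecords, int gridSize) {
        Map<String, int[]> partitions = new LinkedHashMap<>();
        int range = totalRecords / gridSize;
        for (int i = 0; i < gridSize; i++) {
            int min = i * range + 1;
            // last partition absorbs any remainder
            int max = (i == gridSize - 1) ? totalRecords : (i + 1) * range;
            partitions.put("partition" + i, new int[] { min, max });
        }
        return partitions;
    }
}
```

So with 100 records and 4 partitions, each worker gets a 25-record range.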
After processing, BATCH_JOB_EXECUTION shows the job as failed, and BATCH_STEP_EXECUTION shows one of the partitions as failed. But the same is not reflected in the SCDF dashboard: there, the task and task execution show a completed status, while the job execution shows a failed status.
Two questions:
- How do I make sure SCDF reflects the right batch status in the dashboard?
- How do I restart the failed job execution?
For the first question, I tried setting the "spring.cloud.task.batch.fail-on-job-failure=true" property in application.properties, but I get "Job must not be null nor empty" from TaskJobLauncherApplicationRunnerFactoryBean.java.
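For reference, a minimal sketch of the relevant application.properties; the `spring.batch.job.names` line and the job name `partitionedJob` are illustrative guesses on my part, not something confirmed to fix this:

```properties
# fail the Cloud Task when the launched batch job fails
spring.cloud.task.batch.fail-on-job-failure=true
# explicitly naming the job to launch (job bean name is a placeholder)
spring.batch.job.names=partitionedJob
```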
For the second question, I tried relaunching the task using the REST API below:

```shell
curl 'http://localhost:9393/jobs/executions/1' -i -X PUT \
  -H 'Accept: application/json' \
  -d 'restart=true'
```
But it restarted all the partitions, not just the failed one.
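For what it's worth, my understanding of Spring Batch's default restart rule is that only steps whose last status was not COMPLETED should be re-executed, unless a step was built with allowStartIfComplete(true). A plain-Java sketch of that rule as I understand it (names are illustrative; this is not Spring Batch code):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class RestartSemantics {
    // Models the default Spring Batch restart rule: a step is re-executed
    // only if its last status was not COMPLETED, or if the step was
    // configured with allowStartIfComplete(true).
    public static List<String> stepsToRerun(Map<String, String> lastStatus,
                                            boolean allowStartIfComplete) {
        List<String> rerun = new ArrayList<>();
        for (Map.Entry<String, String> e : lastStatus.entrySet()) {
            if (allowStartIfComplete || !"COMPLETED".equals(e.getValue())) {
                rerun.add(e.getKey());
            }
        }
        return rerun;
    }
}
```

By that rule, only the failed partition should rerun on restart, so I don't understand why all four partitions ran again.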