I am trying to build a sample application that runs steps in parallel using Java configuration, but I am confused about how many beans (job repository, job launcher, job execution, etc.) need to be configured and initialized, and if so, how. Put simply, I need a sample application that clarifies the basics of parallel step execution within a job.
3 Answers
13
Here's an example of using splits via java config. In this example, flows 1 and 2 will be executed in parallel:
import org.springframework.batch.core.Job;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.job.builder.FlowBuilder;
import org.springframework.batch.core.job.flow.Flow;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
public class BatchConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Tasklet tasklet() {
        return new CountingTasklet();
    }

    @Bean
    public Flow flow1() {
        return new FlowBuilder<Flow>("flow1")
                .start(stepBuilderFactory.get("step1")
                        .tasklet(tasklet()).build())
                .build();
    }

    @Bean
    public Flow flow2() {
        return new FlowBuilder<Flow>("flow2")
                .start(stepBuilderFactory.get("step2")
                        .tasklet(tasklet()).build())
                .next(stepBuilderFactory.get("step3")
                        .tasklet(tasklet()).build())
                .build();
    }

    @Bean
    public Job job() {
        return jobBuilderFactory.get("job")
                .start(flow1())
                .split(new SimpleAsyncTaskExecutor()).add(flow2())
                .end()
                .build();
    }

    public static class CountingTasklet implements Tasklet {
        @Override
        public RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext) throws Exception {
            System.out.println(String.format("%s has been executed on thread %s",
                    chunkContext.getStepContext().getStepName(), Thread.currentThread().getName()));
            return RepeatStatus.FINISHED;
        }
    }
}
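Regarding the question of how the job repository, job launcher, etc. get configured: with Spring Boot they don't have to be configured by hand at all. A minimal bootstrap sketch, assuming Spring Boot and Spring Batch are on the classpath (the class name `ParallelStepsApplication` is just an example):

```java
// Minimal bootstrap sketch, assuming Spring Boot + Spring Batch on the classpath.
// @EnableBatchProcessing registers a JobRepository, JobLauncher, JobBuilderFactory
// and StepBuilderFactory automatically; Spring Boot then runs the Job bean on startup.
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
@EnableBatchProcessing
public class ParallelStepsApplication {
    public static void main(String[] args) {
        SpringApplication.run(ParallelStepsApplication.class, args);
    }
}
```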

Michael Minella
- Yes. Because I only want to know how multiple steps can run in parallel, this gives me an idea. – maddy Jun 03 '16 at 12:26
- @MichaelMinella I have a scenario where the number of flows depends on the number of rows in a table, and each flow should run taking one row as input. Is there a way of achieving this? – Ganapathi Basimsetti Feb 12 '18 at 09:42
- This is the answer! Also, you do not need to set a tasklet in each step if you don't want to; it will work without one. – Nitish Kumar Feb 17 '18 at 10:06
- @NitishKumar Is your comment a response to mine? – Ganapathi Basimsetti Feb 20 '18 at 03:24
- @Ganapathi004 No – Nitish Kumar Feb 22 '18 at 13:11
- Could you provide an example of how to execute step1 and, after it terminates, start step2 and step3 in parallel? – gstackoverflow Aug 07 '19 at 13:44
8
Suppose you have steps A, B1, B2, B3, and C, and you want to run B1, B2, and B3 in parallel. You first need to create a sub-flow for each of them, then combine those sub-flows into a single split flow driven by a SimpleAsyncTaskExecutor():
@Bean
public Job job() {
    final Flow flowB1 = new FlowBuilder<Flow>("subflowb1").from(stepb1()).end();
    final Flow flowB2 = new FlowBuilder<Flow>("subflowb2").from(stepb2()).end();
    final Flow flowB3 = new FlowBuilder<Flow>("subflowb3").from(stepb3()).end();

    final Flow splitFlow = new FlowBuilder<Flow>("splitFlow")
            .start(flowB1)
            .split(new SimpleAsyncTaskExecutor())
            .add(flowB2, flowB3)
            .build();

    return jobBuilderFactory
            .get("job") // the factory needs a job name before a flow can be attached
            .flow(stepA())
            .next(splitFlow)
            .next(stepC())
            .end()
            .build();
}
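The step beans this example references (stepA(), stepb1(), and so on) are assumed to exist elsewhere in the configuration. A minimal sketch of one of them, using a tasklet as a placeholder body:

```java
// Sketch of one of the step beans the example above assumes (stepA, stepb1, ...).
// Each is just a tasklet step built from the injected StepBuilderFactory;
// the tasklet body here is a placeholder.
@Bean
public Step stepb1() {
    return stepBuilderFactory.get("stepb1")
            .tasklet((contribution, chunkContext) -> {
                System.out.println("stepb1 on " + Thread.currentThread().getName());
                return RepeatStatus.FINISHED;
            })
            .build();
}
```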

Ganapathi Basimsetti

Nikhil Pareek
- the syntax is a bit confusing: does invoking flowB1 in start, then calling split and then add, ensure all 3 run in parallel, or just B2 and B3? – IcedDante Apr 27 '21 at 23:27
0
Here is basic parallel step execution over different data sets. You have to provide a Partitioner that creates a separate execution context for each step instance; based on its own context, each instance then works on its own slice of the data.
<batch:job id="myJob" job-repository="jobRepository">
    <batch:step id="master">
        <batch:partition step="step1" partitioner="stepPartitioner">
            <batch:handler grid-size="4" task-executor="taskExecutor"/>
        </batch:partition>
    </batch:step>
</batch:job>

<batch:step id="step1">
    <batch:tasklet>
        <batch:chunk reader="myReader" processor="myProcessor" writer="myWriter"
                     commit-interval="10"/>
    </batch:tasklet>
</batch:step>
public class StepPartitioner implements Partitioner {

    @Autowired
    DaoInterface daoInterface;

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> result = new HashMap<>();
        List<String> keys = daoInterface.getUniqueKeyForStep();
        for (String key : keys) {
            // one execution context per key: each partition sees only its own key
            ExecutionContext executionContext = new ExecutionContext();
            executionContext.putString("key", key);
            result.put(key, executionContext);
        }
        return result;
    }
}
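To consume the key inside a worker step, the reader can be declared step-scoped so that Spring Batch late-binds the value from each partition's execution context. A sketch, where KeyAwareReader is a hypothetical reader class that filters rows by the given key:

```java
// Sketch of how a worker-step component can pick up the "key" that the
// partitioner stored in each ExecutionContext. @StepScope defers bean creation
// until the step runs, so the SpEL expression reads the partition's own context.
// KeyAwareReader is a hypothetical class, not part of Spring Batch.
@Bean
@StepScope
public ItemReader<String> myReader(
        @Value("#{stepExecutionContext['key']}") String key) {
    return new KeyAwareReader(key); // fetch only the rows matching this key
}
```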

sandeep