
1) I have a large file (> 100k lines) that needs to be processed, with a lot of business validation and checks against external systems for each line item. The code is being migrated from a legacy app, and I put this business logic into the AsyncItemProcessor, which also persists the data into the DB. Is it good practice to create/save records in the ItemProcessor (in lieu of the ItemWriter)?

2) The code is:

@Configuration
@EnableAutoConfiguration
@ComponentScan(basePackages = "com.liquidation.lpid")
@EntityScan(basePackages = "com.liquidation.lpid.entities")
@EnableTransactionManagement
public class SimpleJobConfiguration {
    @Autowired
    public JobRepository jobRepository;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    @Qualifier("myFtpSessionFactory")
    private SessionFactory myFtpSessionFactory;

    @Autowired
    public JobBuilderFactory jobBuilderFactory;


    @Bean
    public ThreadPoolTaskExecutor lpidItemTaskExecutor() {
        ThreadPoolTaskExecutor tExec = new ThreadPoolTaskExecutor();
        tExec.setCorePoolSize(10);
        tExec.setMaxPoolSize(10);
        tExec.setAllowCoreThreadTimeOut(true);
        return tExec;
    }

    @BeforeStep
    public void beforeStep(StepExecution stepExecution){
        String name = stepExecution.getStepName();
        System.out.println("name: " + name);
    }

    @Bean
    public SomeItemWriterListener someItemWriterListener(){
        return new SomeItemWriterListener();
    };

    @Bean
    @StepScope
    public FlatFileItemReader<FieldSet> lpidItemReader(@Value("#{stepExecutionContext['fileResource']}") String fileResource) {
        System.out.println("itemReader called !!!!!!!!!!! for customer data" + fileResource);

        FlatFileItemReader<FieldSet> reader = new FlatFileItemReader<FieldSet>();
        reader.setResource(new ClassPathResource("/data/stage/"+ fileResource));
        reader.setLinesToSkip(1);
        DefaultLineMapper<FieldSet> lineMapper = new DefaultLineMapper<FieldSet>();
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        reader.setSkippedLinesCallback(new LineCallbackHandler() {
            public void handleLine(String line) {
                if (line != null) {
                    tokenizer.setNames(line.split(","));
                }
            }
        });
        lineMapper.setLineTokenizer(tokenizer);
        lineMapper.setFieldSetMapper(new PassThroughFieldSetMapper());

        lineMapper.afterPropertiesSet();
        reader.setLineMapper(lineMapper);
        return reader;
    }


    @Bean
    public ItemWriter<FieldSet> lpidItemWriter() {
        return new LpidItemWriter();
    }

    @Autowired
    private  MultiFileResourcePartitioner multiFileResourcePartitioner;


    @Bean
    public Step masterStep() {
        return stepBuilderFactory.get("masterStep")
                .partitioner(slaveStep().getName(), multiFileResourcePartitioner)
                .step(slaveStep())
                .gridSize(4)
                .taskExecutor(lpidItemTaskExecutor())
                .build();   
    }

    @Bean
    public ItemProcessListener<FieldSet,String> processListener(){
        return new LpidItemProcessListener();
    }


    @SuppressWarnings("unchecked")
    @Bean
    public Step slaveStep() {
        return stepBuilderFactory.get("slaveStep")
                .<FieldSet,FieldSet>chunk(5)
                .faultTolerant()
                .listener(new ChunkListener())
                .reader(lpidItemReader(null))
                .processor(asyncItemProcessor())
                .writer(asyncItemWriter())
                .listener(someItemWriterListener())
                .build();
    }

    @Bean
    public AsyncItemWriter<FieldSet> asyncItemWriter(){
        AsyncItemWriter<FieldSet> asyncItemWriter = new AsyncItemWriter<>();
        asyncItemWriter.setDelegate(lpidItemWriter());
        try {
            asyncItemWriter.afterPropertiesSet();
        } catch (Exception e) {
            e.printStackTrace();
        }

        return asyncItemWriter;
    }

    @Bean
    public ItemProcessor<FieldSet, FieldSet> processor() {
        return new lpidCheckItemProcessor();
    }


    @Bean
    public AsyncItemProcessor<FieldSet, FieldSet> asyncItemProcessor() {
        AsyncItemProcessor<FieldSet, FieldSet> asyncItemProcessor = new AsyncItemProcessor<FieldSet, FieldSet>();

        asyncItemProcessor.setDelegate(processor());
        asyncItemProcessor.setTaskExecutor(lpidItemTaskExecutor());
        try {
            asyncItemProcessor.afterPropertiesSet();
        } catch (Exception e) {
            e.printStackTrace();
        }

        return asyncItemProcessor;
    }


    @Bean
    public Job job() throws Exception {
        return jobBuilderFactory.get("job").incrementer(new RunIdIncrementer()).start(masterStep()).build();
    }
}

The ItemWriter runs before the ItemProcessor has completed. My understanding is: for every chunk, the ItemReader reads the data, the ItemProcessor churns through each item, and at the end of the chunk the ItemWriter gets called (which in my case does nothing, since the ItemProcessor persists the data). But the ItemWriter gets called before the ItemProcessor has completed, and my job never finishes. What am I doing incorrectly here? (I looked at previous questions around this, and the solution was to wrap the writer in an AsyncItemWriter, which I am doing.)
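For context on the behaviour I expected: the AsyncItemProcessor/AsyncItemWriter hand-off can be sketched in plain Java without Spring (class and method names below are mine, just to illustrate the Future hand-off for one chunk):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class AsyncChunkSketch {

    // Mimics the async pattern for one chunk: the "processor" submits work
    // and returns Futures immediately; the "writer" is then invoked right
    // away with those Futures, but blocks on Future.get() for each item.
    static List<String> processChunk(List<String> chunk) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);

        // "AsyncItemProcessor": submit each item, collect Futures (no waiting here)
        List<Future<String>> futures = new ArrayList<>();
        for (String item : chunk) {
            futures.add(pool.submit(() -> item.toUpperCase()));
        }

        // "AsyncItemWriter": called before processing is done, but get() waits
        List<String> written = new ArrayList<>();
        for (Future<String> f : futures) {
            written.add(f.get());
        }
        pool.shutdown();
        return written;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(processChunk(List.of("a", "b", "c"))); // prints [A, B, C]
    }
}
```

So, if I understand the pattern, the writer being *called* early is expected; it should still wait on the Futures before the chunk commits.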

Thanks, Sundar

  • Why are you writing from the `ItemProcessor`. That isn't recommended. – Michael Minella Apr 11 '17 at 15:34
  • I noted the following in your reader configuration: `System.out.println("itemReader called !!!!!!!!!!! for customer data" + fileResource);` which got me wondering: how do you check that the itemWriter is called before the itemProcessor? – qtips Apr 17 '17 at 17:59
  • Did you get a solution? – DarkCrow Aug 31 '18 at 14:21

0 Answers