
I have a requirement where we receive a CSV file as a byte stream through an ECS S3 pre-signed URL. I have to validate the data, write the validation-successful and failed records to two different CSV files, and store them in an ECS S3 bucket by converting them to an InputStream. I also have to write the successful records to a database, along with the pre-signed URLs of the inbound, success, and failure files.

I'm new to Spring Batch. How should I approach this requirement?

If I choose a FlatFileItemReader to read and an ItemProcessor to process the data, how should I write to the different files and to the database?

or

Should I create a job using Tasklets? TIA.

Vinda
  • Please check if the below is fine - Option 1. I can share the code snippet. Reader ---> Processor (validate and update the object with success and failure results) --> CompositeWriter (SuccessWriter + FailureWriter) – Rakesh Dec 31 '20 at 03:58
  • The above will be a better option for you than handling validations in a listener. Please let me know if a code snippet is required and I can share it. Once you confirm, I can share it as an answer. – Rakesh Dec 31 '20 at 04:00
  • That would be helpful; can you please share a code snippet? Thank you! – Vinda Dec 31 '20 at 04:35

1 Answer


Please find a sample code snippet below. Let me know if you face any issues.

    // Your input/output DTO - this is the key object
    class BaseCSVDTO {
        // yourCSVMappedFields
        private SuccessCSVObject successObject;
        private FailureCSVObject failureObject;
    }

    // Read the file in the reader as normal; better to create a custom reader if you want more control
    @Bean
    public ItemReader<BaseCSVDTO> yourFlatFileItemReader() {
        // configure a FlatFileItemReader here; Spring Batch populates the mapped fields automatically
        // (a hedged sketch follows below)
    }
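A hedged sketch of what that reader could look like, assuming Spring Batch's FlatFileItemReaderBuilder, a property holding the inbound pre-signed URL (in practice it would more likely arrive as a job parameter), and placeholder column/field names that are not part of the original post:

    // Sketch only: the property name, column names, and DTO fields are assumptions.
    @Value("${inbound.presigned.url}")
    private String inboundPreSignedUrl;

    @Bean
    public FlatFileItemReader<BaseCSVDTO> yourFlatFileItemReader() throws MalformedURLException {
        BeanWrapperFieldSetMapper<BaseCSVDTO> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
        fieldSetMapper.setTargetType(BaseCSVDTO.class);
        return new FlatFileItemReaderBuilder<BaseCSVDTO>()
                .name("inboundCsvReader")
                .resource(new UrlResource(inboundPreSignedUrl)) // streams the CSV straight from the pre-signed URL
                .linesToSkip(1)                                 // skip the header row
                .delimited()
                .names("field1", "field2")                      // your CSV column names
                .fieldSetMapper(fieldSetMapper)                 // maps the columns onto BaseCSVDTO fields
                .build();
    }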

    @Bean
    public CSVProcessor csvValidationProcessor() {
        return new CSVProcessor();
    }
    
    class CSVProcessor implements ItemProcessor<BaseCSVDTO, BaseCSVDTO> {
        @Override
        public BaseCSVDTO process(BaseCSVDTO eachCSVitem) throws Exception {
            // validate each item and put the result in the success or failure object
            // Example of success:
            SuccessCSVObject successObject = new SuccessCSVObject();
            eachCSVitem.setSuccessObject(successObject);
            // Same way for the failure object
            return eachCSVitem;
        }
    }

    @Bean
    public CompositeItemWriter<BaseCSVDTO> compositeWriter() throws Exception {
        CompositeItemWriter<BaseCSVDTO> compositeItemWriter = new CompositeItemWriter<>();
        List<ItemWriter<? super BaseCSVDTO>> writers = new ArrayList<>();
        writers.add(successCSVWriter());
        writers.add(failureCSVWriter());
        compositeItemWriter.setDelegates(writers);
        return compositeItemWriter;
    }

    @Bean
    public ItemWriter<BaseCSVDTO> successCSVWriter() {
        return new SuccessWriter();
    }

    @Bean
    public ItemWriter<BaseCSVDTO> failureCSVWriter() {
        return new FailureWriter();
    }

    
    public class SuccessWriter implements ItemWriter<BaseCSVDTO> {
        @Override
        public void write(List<? extends BaseCSVDTO> items) {
            for (BaseCSVDTO baseCSVDTO : items) {
                baseCSVDTO.getSuccessObject();
                // write the success CSV
            }
        }
    }

    public class FailureWriter implements ItemWriter<BaseCSVDTO> {
        @Override
        public void write(List<? extends BaseCSVDTO> items) {
            for (BaseCSVDTO baseCSVDTO : items) {
                baseCSVDTO.getFailureObject();
                // write the failure CSV
            }
        }
    }
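The original requirement also asks to persist the successful records; as the comments below note, that can be done with JdbcTemplate inside SuccessWriter or with another delegate added to the composite writer. A hedged sketch of such a delegate using JdbcBatchItemWriterBuilder; the table name, columns, and :field placeholders are assumptions and would have to match BaseCSVDTO's getters:

    // Sketch only: table name, columns, and bean property names are assumptions.
    @Bean
    public JdbcBatchItemWriter<BaseCSVDTO> successDbWriter(DataSource dataSource) {
        return new JdbcBatchItemWriterBuilder<BaseCSVDTO>()
                .dataSource(dataSource)
                .sql("INSERT INTO success_records (field1, field2) VALUES (:field1, :field2)")
                .beanMapped() // resolves :field1/:field2 against BaseCSVDTO getters
                .build();
    }

Keep in mind that every delegate in the composite writer receives all items, failed or not, so you would either filter inside the writer or route items with a ClassifierCompositeItemWriter if only successful records should reach the database.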

    // Finally, the job step
    @Bean
    public Step executionStep() throws Exception {
        return stepBuilderFactory.get("executionStep").<BaseCSVDTO, BaseCSVDTO>chunk(chunkSize)
                .reader(yourFlatFileItemReader()).processor(csvValidationProcessor()).writer(compositeWriter())
                //.faultTolerant()
                //.skipLimit(skipErrorCount).skip(Exception.class)//.noSkip(FileNotFoundException.class)
                //.listener(validationListener())
                //.noRetry(Exception.class)
                //.noRollback(Exception.class)
                .build();
    }
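The snippet stops at the step; a minimal sketch of wiring it into a job, assuming the same jobBuilderFactory style of configuration (the job name is a placeholder):

    @Bean
    public Job csvValidationJob() throws Exception {
        return jobBuilderFactory.get("csvValidationJob")
                .incrementer(new RunIdIncrementer()) // new run id for each execution
                .start(executionStep())
                .build();
    }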
Rakesh
  • Thank you so much for sharing this code snippet, but I have to convert the CSV file that is created to a byte stream for storing it in the ECS S3 bucket. If I write to a CSV file in chunks, when/where should I convert the created file to a byte stream? – Vinda Dec 31 '20 at 06:51
  • Also, after storing these files in the ECS S3 bucket I should retrieve the pre-signed URLs of both files and write them to the DB. – Vinda Dec 31 '20 at 06:53
  • In the writer you get a separate Java object for success and failure; you can write this Java object using Java IO byte streams or an OpenCSV byte stream. – Rakesh Dec 31 '20 at 06:56
  • In the SuccessWriter and FailureWriter you can do a lot of actions based on your need; if you want, you can write to the DB with JdbcTemplate, or create another writer and add it to the composite writer. – Rakesh Dec 31 '20 at 06:58
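Following up on the comments above: one hedged approach for the byte-stream part is to collect the success and failure lines in memory (for example in a ByteArrayOutputStream inside each writer) and, once the step completes, e.g. in a StepExecutionListener's afterStep, upload the bytes to the outbound pre-signed URLs and store those URLs in the database. The helper below is a sketch under those assumptions and is not part of the original answer; it uses a plain HTTP PUT, which is how S3 pre-signed upload URLs are typically consumed.

    // Hypothetical helper: PUTs the generated CSV bytes to an S3 pre-signed upload URL.
    public class PreSignedUrlUploader {

        public void upload(String preSignedUrl, byte[] csvBytes) throws IOException {
            HttpURLConnection connection = (HttpURLConnection) new URL(preSignedUrl).openConnection();
            connection.setDoOutput(true);
            connection.setRequestMethod("PUT");
            connection.setRequestProperty("Content-Type", "text/csv");
            try (OutputStream out = connection.getOutputStream()) {
                // if an S3 SDK client is used instead, the same bytes can be wrapped in a ByteArrayInputStream
                out.write(csvBytes);
            }
            int status = connection.getResponseCode();
            if (status != HttpURLConnection.HTTP_OK) {
                throw new IOException("Upload to pre-signed URL failed with HTTP " + status);
            }
        }
    }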