
I am new to Spring Batch, and I would like to find the best approach for the use case described here:

I have multiple CSV files that I would like to load into memory (as collection stores, i.e. Lists or Maps) and then use/refer to in subsequent steps/jobs for my business logic.

Let's take one example: an object XX stored in a Map by an ItemWriter.

The model for object XX:

public class Object {

    private int x;
    private int y;

    // getters and setters
}

The ItemReader for object XX:

public class ObjectItemReader extends FlatFileItemReader<Object> {

    public ObjectItemReader() {
        // read the tab-delimited CSV file from the classpath
        this.setResource(new ClassPathResource("xxx.csv"));
        this.setLineMapper(new DefaultLineMapper<Object>() {{
            setLineTokenizer(new DelimitedLineTokenizer() {{
                setNames(new String[] { "x", "y" });
                setDelimiter(DELIMITER_TAB);
            }});
            // map each tokenized line onto the x/y properties of the model
            setFieldSetMapper(new BeanWrapperFieldSetMapper<Object>() {{
                setTargetType(Object.class);
            }});
        }});
    }
}

The ItemWriter, which stores the items in a Map:

public class ObjectItemWriter implements ItemWriter<Object> {

    private final Map<Integer, Object> objectMap;

    public ObjectItemWriter() {
        System.out.println("Map store is created");
        objectMap = new HashMap<Integer, Object>();
    }

    @Override
    public void write(List<? extends Object> items) throws Exception {
        // key the map by the item's x value (an int, hence Integer keys)
        for (Object item : items) {
            objectMap.put(item.getX(), item);
        }
    }

    public Map<Integer, Object> getObjectMap() {
        return objectMap;
    }
}

As you can see, all records are stored in the Map by the ItemWriter. I tested access to this Map from another step with a simple Tasklet:

public class TaskletStep implements Tasklet {

    @Autowired
    private ObjectItemWriter objectItemWriter;

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {

        System.out.println(objectItemWriter.getObjectMap().size());

        return RepeatStatus.FINISHED;
    }
}
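
For reference, the job configuration I use looks roughly like this (simplified; the bean, step and job names here are just examples). The writer and the tasklet are declared as beans so that the @Autowired writer inside the tasklet is the same instance, and therefore the same Map, that the first step filled:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class JobConfig {

    @Bean
    public ObjectItemWriter objectItemWriter() {
        // single shared instance: the tasklet in the next step reads the same Map
        return new ObjectItemWriter();
    }

    @Bean
    public Tasklet mapCheckTasklet() {
        // my TaskletStep class from above; declared as a bean so @Autowired is resolved
        return new TaskletStep();
    }

    @Bean
    public Job csvToMapJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
        // chunk-oriented step: the reader reads the CSV, the writer puts the items into the Map
        Step loadStep = steps.get("loadStep")
                .<Object, Object>chunk(10)          // Object is the model class above
                .reader(new ObjectItemReader())
                .writer(objectItemWriter())
                .build();

        // tasklet step: accesses the Map populated by the previous step
        Step checkStep = steps.get("checkStep")
                .tasklet(mapCheckTasklet())
                .build();

        return jobs.get("csvToMapJob")
                .start(loadStep)
                .next(checkStep)
                .build();
    }
}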

My questions:

Is there another/better way to store all the CSV files in memory using only an ItemReader or ItemProcessor, since this is simply reading data from files into a Map?

Is the ItemWriter an essential step for storing these files in a Map?


1 Answer


In Spring Batch's chunk-oriented steps, defining a reader and a writer (as part of a step) is mandatory, but a processor is optional. See the Spring Batch reference documentation on chunk-oriented processing.

You can always choose to do nothing in a component, and you can do whatever you want in a component regardless of its name (reader, processor or writer).

Having said that, you have not specified why you want to populate the map in the reader or the processor rather than in the writer, i.e. what specific issues you face by populating the map in the writer.

In my opinion, if you have chosen Spring Batch, you should design your program along its predefined, assumed flow to keep the code and design clean. From that perspective, your current approach looks better than what you are planning.

Is there another/better way to store all the CSV files in memory using only an ItemReader or ItemProcessor, since this is simply reading data from files into a Map?

As already mentioned, you can populate the map in a processor and have a writer that does nothing. Note that chunking will happen anyhow and control will still pass to the writer to commit the transaction. In my opinion, if you don't wish to transform a read item before writing, just omit the processor and send items directly from the reader to the writer (in chunks). If you really wanted the processor-based variant, it would look roughly like the sketch below.
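
For illustration only, here is a rough sketch of that processor-based variant (the class names are mine, not from your post, and as said above I would not recommend it): the processor stores each item in the map as a side effect, and the writer is a no-op.

import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemWriter;

// "Object" below is the model class from your question.
public class MapPopulatingProcessor implements ItemProcessor<Object, Object> {

    private final Map<Integer, Object> objectMap = new HashMap<Integer, Object>();

    @Override
    public Object process(Object item) throws Exception {
        objectMap.put(item.getX(), item); // side effect: store the item in memory
        return item;                      // still hand the item on to the (no-op) writer
    }

    public Map<Integer, Object> getObjectMap() {
        return objectMap;
    }
}

// In a separate file: a writer that does nothing - chunking still happens
// and the transaction is still committed at each chunk boundary.
public class NoOpItemWriter implements ItemWriter<Object> {

    @Override
    public void write(List<? extends Object> items) throws Exception {
        // intentionally empty
    }
}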

By populating the map in the reader, you would violate the Single Responsibility Principle (SRP), which is not advisable.

Is the ItemWriter an essential step for storing these files in a Map?

It is essential as long as you believe in decoupled components and SRP.

Why would somebody need three components if a single component can do the job? Your questions make me wonder why we would even need the Spring Batch API/framework at all (just to use the FlatFileItemReader class?).

Hope it helps!

  • First, thank you for your kind help; I find your answer very interesting. As per your question about what specific issues I face by populating the map in the writer: in fact it is a design matter, because I have other ItemWriters in the same package that write data out to CSV files, and I don't want them to be mixed up with the writer I use to put items into the Map, since the latter doesn't really write anything, it only stores items in memory. – Feres.o Sep 21 '17 at 08:37
  • You're welcome. Writing shouldn't be confused with whether it is an in-memory write or a persistent write; writing means that the read item has been processed and is ready to be stored. Also, with the API's built-in readers, you will find it difficult to store items in the reader because you are not calling the `read()` method yourself. – Sabir Khan Sep 21 '17 at 08:44