
I am configuring a new job that reads data from a database; in the processor, each item is used to call a REST endpoint with a payload. Along with the dynamic data, the payload must include reference data that is constant for every record processed in the job. This reference data is stored in the DB. I am considering the following approaches:

  1. In the beforeJob listener method, make a DB call to populate the reference data object, and use it for the whole job run.
  2. In the processor, make a DB call to get the reference data and cache the query result, so there is no DB call to fetch the same data for each record.

Please suggest whether these approaches are correct, or if there is a better way to implement this in Spring Batch.

1 Answer


For performance reasons, I would not recommend doing a DB call in the item processor, unless that is really a requirement.

The first approach seems reasonable to me, since the reference data is constant. You can populate/clear a cache with a JobExecutionListener and use the cache in your chunk-oriented step. Please refer to the following thread for more details and a complete sample: Spring Batch With Annotation and Caching.
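The listener-populated cache described above can be sketched in plain Java. To keep the snippet self-contained, the Spring Batch types are stubbed out: in a real job, `ReferenceDataLoader` would implement `JobExecutionListener` and `EnrichingProcessor` would implement `ItemProcessor<I, O>`. All class, method, and key names here are illustrative, and the DB call is replaced by a hard-coded list:

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Job-scoped in-memory cache for the constant reference data (names are illustrative).
class ReferenceDataCache {
    private final Map<String, List<String>> data = new ConcurrentHashMap<>();

    public void put(String key, List<String> values) { data.put(key, values); }
    public List<String> get(String key) { return data.get(key); }
    public void clear() { data.clear(); }
}

// In a real job this would implement org.springframework.batch.core.JobExecutionListener:
// beforeJob() loads the reference data once, afterJob() clears it.
class ReferenceDataLoader {
    private final ReferenceDataCache cache;

    ReferenceDataLoader(ReferenceDataCache cache) { this.cache = cache; }

    public void beforeJob() {
        // One DB call per job run; a real implementation would run the query here
        // (e.g. via a JdbcTemplate) instead of this hard-coded stand-in.
        cache.put("referenceData", List.of("REF-A", "REF-B", "REF-C"));
    }

    public void afterJob() {
        cache.clear();
    }
}

// In a real job this would implement ItemProcessor<String, String>: no DB call
// per item, the reference data is read from the cache filled in beforeJob().
class EnrichingProcessor {
    private final ReferenceDataCache cache;

    EnrichingProcessor(ReferenceDataCache cache) { this.cache = cache; }

    public String process(String item) {
        List<String> ref = cache.get("referenceData");
        // Build the REST payload from the item plus the cached reference data.
        return item + " enriched with " + ref;
    }
}
```

The key point is that the cache is populated exactly once in `beforeJob()` and cleared in `afterJob()`, so the processor touches only memory for each record.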

Mahmoud Ben Hassine
  • Thanks Mahmoud. Is there a limitation on the size of the data that can be added to the cache? In my case the reference data is a list of String objects. – Sourabh Sharma Sep 25 '22 at 02:48
  • Thanks Mahmoud. I will be using an in-memory cache similar to your example. I will implement it and check. – Sourabh Sharma Sep 25 '22 at 15:03
  • I left a message here to answer the question in your comment but I cannot see it anymore.. strange. I said that the size limit is bound by the amount of memory you allocate to your cache. This should not be a problem even if your reference data does not fit in memory, because you can change the implementation of the cache and use a persistent one instead (like Redis, for instance). Anyway, I hope these details helped. – Mahmoud Ben Hassine Sep 26 '22 at 09:15
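The implementation swap suggested in the last comment is easiest when the cache sits behind a small interface. This is a sketch with illustrative names; a Redis-backed variant would implement the same interface by delegating to a Redis client:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Abstracting the cache lets you swap the in-memory store for a persistent one
// (e.g. Redis) without touching the job listener or the processor.
interface ReferenceCache {
    void put(String key, List<String> values);
    List<String> get(String key);
    void clear();
}

// Default in-memory implementation, fine while the reference data fits in the heap.
class InMemoryReferenceCache implements ReferenceCache {
    private final Map<String, List<String>> store = new HashMap<>();

    public void put(String key, List<String> values) { store.put(key, values); }
    public List<String> get(String key) { return store.get(key); }
    public void clear() { store.clear(); }
}

// A Redis-backed variant would implement ReferenceCache with put/get/clear
// delegating to a Redis client, and be injected in place of the in-memory one.
```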