How should I decide on the maximum size of the CLOB fields in the tables Batch_Job_Execution_Context and Batch_Step_Execution_Context? Do we always need to set this to 4 GB? How do we estimate the maximum size needed for my project? What is the recommended practice?

jrbedard

user2714010
-
You can estimate it from how many times you kick off the job per day/month/year. You can also start a job and see how much storage the Spring Batch tables actually occupy, or ask your DBA for that information. Based on that, you can decide what size to set up for the Spring Batch tables. – Nghia Do Sep 27 '16 at 11:36
-
I was interested to know more about the Serialized_context column and the size we should set for these two columns in the tables. How does this column get used: is it used only when a job fails and restarts, or is it used every time a job runs? – user2714010 Sep 27 '16 at 13:53
-
When you store anything in the JobExecutionContext or StepExecutionContext, those fields will contain data. You can see an example here https://bigzidane.wordpress.com/2016/09/12/spring-batch-partitionerreaderprocesorwriterhibernateintellij/ — look at the Partitioner level to see how we store to the StepExecutionContext. – Nghia Do Sep 27 '16 at 21:50
-
Does it get used when a job restarts? – user2714010 Sep 28 '16 at 13:08
-
Yes, it does. – Nghia Do Sep 28 '16 at 13:45
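Following the suggestion in the comments to measure rather than guess: a rough way to get a feel for the size of a serialized execution context is to serialize a representative key/value map yourself and count the bytes. This is only a hypothetical sketch using plain JDK serialization as a stand-in for whatever serializer your Spring Batch version actually uses, and the keys below are invented examples of what a partitioned step might store:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.io.UncheckedIOException;
import java.util.HashMap;
import java.util.Map;

public class ContextSizeEstimate {
    // Serialize a key/value map roughly the way a batch framework might persist
    // an execution context, and return the resulting byte count.
    static int serializedSize(Map<String, ?> ctx) {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            out.writeObject(new HashMap<>(ctx));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return buf.size();
    }

    public static void main(String[] args) {
        // Hypothetical entries resembling what a partitioner might put
        // into a StepExecutionContext.
        Map<String, Serializable> ctx = new HashMap<>();
        ctx.put("partition.startId", 1_000_000L);
        ctx.put("partition.endId", 2_000_000L);
        ctx.put("inputFile", "/data/in/chunk-0001.csv");

        System.out.println("Approx. serialized context size: "
                + serializedSize(ctx) + " bytes");
    }
}
```

For a typical context like this the result is on the order of hundreds of bytes, far below 4 GB; the more reliable check is still to run your real job and query the actual length of the Serialized_context column (e.g. with your database's LENGTH/DBMS_LOB function), then size the CLOB with comfortable headroom above the largest observed value.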