
In the BATCH_JOB_EXECUTION_PARAMS table, the "STRING_VAL" column is defined as varchar(250). If a string longer than 250 characters is passed as a job parameter, the database complains that the data is too long. I did some research, and what some people did was manually change the column definition to hold more data. Are there any side effects to storing large parameters in that table? If so, what is the best way to pass a large job parameter?

Thanks.

X. L.

2 Answers


There shouldn't be a side effect, especially if it is a non-identifying parameter.
Even for identifying parameters, the only place this could have a side effect is the generation of the JOB_KEY field in the JOB_INSTANCE table (have a look at JdbcJobInstanceDao).
The content of this field is generated by a JobKeyGenerator, and looking at the default implementation, org.springframework.batch.core.DefaultJobKeyGenerator, I don't see anything that could cause a side effect.
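To illustrate the reasoning, here is a minimal standalone sketch (not the actual Spring Batch source) of hash-based key generation in the spirit of DefaultJobKeyGenerator: because the key is a fixed-length digest of the concatenated identifying parameters, the length of STRING_VAL never leaks into the JOB_KEY column.

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;

    public class JobKeyLengthDemo {

        // Hash the concatenated identifying parameters, as a sketch of what a
        // JobKeyGenerator does; the digest length is constant regardless of input size.
        static String generateKey(String concatenatedParams) throws NoSuchAlgorithmException {
            byte[] digest = MessageDigest.getInstance("MD5")
                    .digest(concatenatedParams.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        }

        public static void main(String[] args) throws NoSuchAlgorithmException {
            String shortParams = "inputFile=in.csv;";
            String longParams = "inputFile=" + "x".repeat(5000) + ";"; // requires Java 11+

            // Both keys are 32 hex characters, so a longer parameter value cannot
            // overflow the JOB_KEY column.
            System.out.println(generateKey(shortParams).length()); // 32
            System.out.println(generateKey(longParams).length());  // 32
        }
    }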

Hansjoerg Wingeier

I would not go down that road, since the schema is part of the Spring Batch framework and is maintained outside of your control. Even if changing the column is safe today, what happens if a future release relies on the 250-character limit for some important framework functionality? You will either get odd bugs when you upgrade to a new version, or you will be locked to your current version because you changed the library's schema yourself.

I answered a similar question in this post. You can create a new table for holding parameters (or whatever else you need) next to the Spring Batch metadata tables, in the same database, and pass just an ID as the job parameter. Inside the Spring Batch job you can then pull whatever you need from that table based on the passed ID, as sketched below.
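A minimal sketch of that approach, assuming a hypothetical application table JOB_PAYLOAD(ID, PAYLOAD) created next to the Batch metadata; only the small payloadId job parameter ever reaches BATCH_JOB_EXECUTION_PARAMS:

    import javax.sql.DataSource;

    import org.springframework.batch.core.StepContribution;
    import org.springframework.batch.core.configuration.annotation.StepScope;
    import org.springframework.batch.core.scope.context.ChunkContext;
    import org.springframework.batch.core.step.tasklet.Tasklet;
    import org.springframework.batch.repeat.RepeatStatus;
    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.jdbc.core.JdbcTemplate;

    @Configuration
    public class LargeParamJobConfig {

        // Step-scoped tasklet that receives only the small "payloadId" job parameter
        // and loads the large value from a custom table next to the Batch metadata.
        @Bean
        @StepScope
        public Tasklet loadPayloadTasklet(DataSource dataSource,
                                          @Value("#{jobParameters['payloadId']}") Long payloadId) {
            return (StepContribution contribution, ChunkContext chunkContext) -> {
                JdbcTemplate jdbc = new JdbcTemplate(dataSource);
                // JOB_PAYLOAD(ID, PAYLOAD) is a hypothetical application table,
                // not part of the Spring Batch schema.
                String largeValue = jdbc.queryForObject(
                        "SELECT PAYLOAD FROM JOB_PAYLOAD WHERE ID = ?",
                        String.class, payloadId);
                // Use the large value here; the Batch parameter table only ever saw the ID.
                System.out.println("Loaded payload of length " + largeValue.length());
                return RepeatStatus.FINISHED;
            };
        }
    }

Launching the job then only needs something like new JobParametersBuilder().addLong("payloadId", 42L).toJobParameters(), which easily fits the 250-character limit.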

Nenad Bozic