
I have been using Spring Batch and so far everything has run pretty well. Lately, though, we have been getting errors related to the step execution context. The error is below:

Encountered an error saving batch meta data for step generatePlacementSupplement in job vodPlacementJob. This job is now in an unknown state and should not be restarted. org.springframework.dao.DataIntegrityViolationException: PreparedStatementCallback; SQL [UPDATE BATCH_STEP_EXECUTION_CONTEXT SET SHORT_CONTEXT = ?, SERIALIZED_CONTEXT = ? WHERE STEP_EXECUTION_ID = ?]; ERROR: invalid input syntax for type oid

I have tried changing the Spring Batch execution context columns to the TEXT data type, as well as raising the VARCHAR limit from 2500 to 10k. Neither approach fixed whatever the problem is. I suspect it's a space issue, as we are passing quite a few parameters, but adding more space didn't help.

Would moving away from the Postgres DB and using an in-memory solution be the best course?
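For reference, the in-memory setup I have in mind would be roughly the sketch below, using Spring Batch's MapJobRepositoryFactoryBean (the bean wiring here is illustrative, not our actual configuration):

```java
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class InMemoryBatchConfig {

    // Map-backed metadata store: no BATCH_* tables to violate, but also no
    // job metadata surviving a JVM restart.
    @Bean
    public JobRepository jobRepository() throws Exception {
        MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean();
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Bean
    public SimpleJobLauncher jobLauncher(JobRepository jobRepository) throws Exception {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        launcher.afterPropertiesSet();
        return launcher;
    }
}
```

The obvious trade-off is that a map-backed repository gives up restartability across JVM restarts, which may matter given that the job is already ending up in an unknown state.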

Matthew Ailes
  • The step execution context should be as small as possible, because it is stored in the metadata tables and you may stumble into this type of error. Check whether you can refactor the step execution or your code to minimize the step execution context size. – Luca Basso Ricci Jan 19 '17 at 06:03
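To illustrate the comment above, a minimal sketch of keeping the context small by storing references rather than payloads (the placementsFile key and the file path are made up for illustration):

```java
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.item.ExecutionContext;

public class ContextUsage {

    static void saveState(StepExecution stepExecution) {
        ExecutionContext ctx = stepExecution.getExecutionContext();

        // Risky: serializing a large object graph into the context is what
        // blows past the SHORT_CONTEXT / SERIALIZED_CONTEXT column sizes.
        // ctx.put("placements", hugeListOfPlacements);

        // Safer: persist the bulky data elsewhere (file, staging table) and
        // keep only a small reference that the next step can resolve.
        ctx.putString("placementsFile",
                "/data/placements-" + stepExecution.getId() + ".json");
        ctx.putLong("placementCount", 42L);
    }
}
```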

1 Answer


I think you'll find that the Spring Batch source code actually knows the column sizes (it has constants for them) and will truncate data to fit. There are even some methods that can be used to override the defaults. Whether overriding them is a good idea I'll leave to you, but I believe it's possible. In other words, you'd have to not only change the database columns, but also call some APIs (or, failing that, modify the Spring Batch source) to tell it that you have increased the column sizes.
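For example, with JobRepositoryFactoryBean the overrides might look roughly like this (a sketch that assumes you have already widened the VARCHAR columns to 10000 yourself, and possibly changed SERIALIZED_CONTEXT to TEXT):

```java
import java.sql.Types;

import javax.sql.DataSource;

import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class BatchRepositoryConfig {

    @Bean
    public JobRepository jobRepository(DataSource dataSource,
            PlatformTransactionManager transactionManager) throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(transactionManager);
        factory.setDatabaseType("POSTGRES");
        // Match the widened VARCHAR columns so Spring Batch stops
        // truncating to the default 2500 characters.
        factory.setMaxVarCharLength(10000);
        // If SERIALIZED_CONTEXT was changed to TEXT, forcing a non-LOB JDBC
        // type may avoid Postgres's "invalid input syntax for type oid".
        factory.setClobType(Types.LONGVARCHAR);
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}
```

setMaxVarCharLength is the hook for the truncation behavior described above; setClobType only matters if the LOB column's type was changed on the database side.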