
I'm getting this error with Sybase while updating records in my table. Below is my table definition:

CREATE TABLE BATCH_STEP_EXECUTION (
    STEP_EXECUTION_ID BIGINT NOT NULL PRIMARY KEY,
    VERSION BIGINT NOT NULL,
    STEP_NAME VARCHAR(100) NOT NULL,
    JOB_EXECUTION_ID BIGINT NOT NULL,
    START_TIME DATETIME NOT NULL,
    END_TIME DATETIME DEFAULT NULL NULL,
    STATUS VARCHAR(10) NULL,
    COMMIT_COUNT BIGINT NULL,
    READ_COUNT BIGINT NULL,
    FILTER_COUNT BIGINT NULL,
    WRITE_COUNT BIGINT NULL,
    READ_SKIP_COUNT BIGINT NULL,
    WRITE_SKIP_COUNT BIGINT NULL,
    PROCESS_SKIP_COUNT BIGINT NULL,
    ROLLBACK_COUNT BIGINT NULL,
    EXIT_CODE VARCHAR(2500) NULL,
    EXIT_MESSAGE VARCHAR(2500) NULL,
    LAST_UPDATED DATETIME,
    CONSTRAINT JOB_EXEC_STEP_FK FOREIGN KEY (JOB_EXECUTION_ID)
        REFERENCES BATCH_JOB_EXECUTION(JOB_EXECUTION_ID)
);

Below is the error message:

org.springframework.dao.DataIntegrityViolationException: PreparedStatementCallback;
SQL [UPDATE BATCH_STEP_EXECUTION set START_TIME = ?, END_TIME = ?, STATUS = ?,
COMMIT_COUNT = ?, READ_COUNT = ?, FILTER_COUNT = ?, WRITE_COUNT = ?, EXIT_CODE = ?,
EXIT_MESSAGE = ?, VERSION = ?, READ_SKIP_COUNT = ?, PROCESS_SKIP_COUNT = ?,
WRITE_SKIP_COUNT = ?, ROLLBACK_COUNT = ?, LAST_UPDATED = ?
where STEP_EXECUTION_ID = ? and VERSION = ?];
Attempt to update or insert row failed because resultant row of size 2633 bytes
is larger than the maximum size (1962 bytes) allowed for this table.;
nested exception is com.sybase.jdbc4.jdbc.SybSQLException: Attempt to update or
insert row failed because resultant row of size 2633 bytes is larger than the
maximum size (1962 bytes) allowed for this table.

These two columns are causing the problem: EXIT_CODE and EXIT_MESSAGE, both defined as VARCHAR(2500).
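
For reference, the 1962-byte limit in the error corresponds to a server built on 2K logical pages. Below is a minimal diagnostic sketch, assuming a Sybase ASE server and an isql session, to confirm the page size and inspect the datatypes the framework actually generated:

-- Logical page size in bytes; 2048 implies the ~1962-byte row limit from the error
select @@maxpagesize
go

-- List each column's datatype and length as actually generated
sp_help BATCH_STEP_EXECUTION
go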

  • You should post the actual update statement as well as the table definition, because I suspect it's not this table that is breaching the row size but rather a work/temp table being created by your update SQL. As an aside, if you care about precision, using REAL datatypes is not the way to go. – Rich Campbell Jul 08 '20 at 10:59
  • I have updated the post with the complete error message. So you are saying it is because of the table "BATCH_STEP_EXECUTION"? – Srihari Uttanur Jul 08 '20 at 12:21
  • The BATCH_STEP_EXECUTION table is auto-generated by my framework, so it's difficult to get the table definition. – Srihari Uttanur Jul 08 '20 at 12:28
  • Yes, it's the BATCH_STEP_EXECUTION table that is too wide - you will need to check what datatypes are being generated for that table (via sp_help tablename or similar). For example, check that you aren't inadvertently creating massive varchar() columns. You cannot get a wider row size without rebuilding the entire database instance onto a bigger page size. – Rich Campbell Jul 08 '20 at 13:41
  • The columns "EXIT_MESSAGE" and "EXIT_CODE" are of type varchar(2500) each. Can I use "sp_chgattribute" and change the row size? – Srihari Uttanur Jul 08 '20 at 14:34
  • I have updated the post with the table definition. – Srihari Uttanur Jul 08 '20 at 15:31
  • You can't use varchar(2500) columns on a 2K-page server, which has a maximum row size of 1962 bytes; if both columns are full you will have over 5000 characters versus 1962. You will need to shrink them accordingly, or ask your DBA whether re-creating the database instance on a larger page size is an option. – Rich Campbell Jul 09 '20 at 07:59
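
Building on the last comment, here is a minimal sketch of the shrink option, assuming Sybase ASE's alter table ... modify syntax; the 250-character length is an assumption, not a Spring Batch requirement - choose whatever keeps the worst-case row under 1962 bytes:

-- Shrink the two VARCHAR(2500) columns so a fully populated row
-- fits within the 1962-byte limit of a 2K-page server
-- (250 is an assumed length; size it to your actual exit codes/messages)
alter table BATCH_STEP_EXECUTION modify EXIT_CODE varchar(250) null
go
alter table BATCH_STEP_EXECUTION modify EXIT_MESSAGE varchar(250) null
go

If the columns are shrunk, Spring Batch should also be told to truncate long exit messages rather than letting the driver fail; JobRepositoryFactoryBean exposes a maxVarCharLength property for this (worth verifying against your Spring Batch version).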
