
The situation is as follows: after upgrading from XE3 (AnyDAC) to XE7 (FireDAC), we suddenly get errors when updating a character field longer than 255 characters on Informix.

We now get this error:

'[FireDAC][Phys][ODBC]-345. Data too large for variable [#1]. Max len = [256], actual len = [1000] Hint: set the TFDParam.Size to a greater value'

The problem is that we are using cached update mode in combination with ApplyUpdates, so we don't have any parameters to set (except for the PK)...

FYI, here is the definition of the table in which we want to update a record:

CREATE TABLE com_monster_im ( 
monim_id INTEGER NOT NULL,
com_monster CHAR(1000),
PRIMARY KEY(monim_id));

The FDQuery component contains the following query:

SELECT monim_id, com_monster
FROM com_monster_im
WHERE monim_id = :paramMonimId

We don't use persistent fields or anything like that.

Copilot
  • Curious. It looks like you may be hitting a bug in a 'FireDAC' ODBC driver, or you need to configure your ODBC driver as it says. Error -345 might be from Informix, but the lengths would not normally be part of an Informix message, so FireDAC is doing some interpreting. Using CHAR(1000) is expensive; you should review using LVARCHAR(1000) instead unless your data will normally be almost all of 1000 bytes long. Informix itself (e.g. via Informix ESQL/C) will be OK with being passed a CHAR(1000) value -- heck, it'll take CHAR(32000) without problem. So I think the trouble is in the driver. – Jonathan Leffler Jan 23 '15 at 22:52
  • Jonathan, I know the dimension of a char(1000) is not ideal, but this table is a legacy table... Changing the data type to lvarchar can't be done for the moment... But it's my opinion too I hit some bug in the firedac driver. – Copilot Jan 24 '15 at 09:07

1 Answer


Add a mapping rule like this:

[screenshot: data mapping rule to add]

and the error will disappear.
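The screenshot of the rule is not reproduced here, but a rule along these lines can also be set in code via the query's (or connection's) FormatOptions. This is a minimal sketch, assuming the CHAR(1000) column surfaces as `dtAnsiString` and mapping it to `dtMemo` so FireDAC no longer binds it as a 256-character string parameter; the exact source/target types may need adjusting to match the rule shown in the screenshot:

```delphi
uses
  FireDAC.Stan.Intf, FireDAC.Stan.Option, FireDAC.Comp.Client;

procedure ConfigureLongCharMapping(AQuery: TFDQuery);
var
  LRule: TFDMapRule;
begin
  // Use this dataset's own rules instead of inheriting the connection's.
  AQuery.FormatOptions.OwnMapRules := True;

  LRule := AQuery.FormatOptions.MapRules.Add;
  // Map string columns longer than 255 characters to a memo type,
  // so updates posted through ApplyUpdates are not truncated to 256.
  LRule.SourceDataType := dtAnsiString;
  LRule.SizeMin := 256;
  LRule.SizeMax := 1000;
  LRule.TargetDataType := dtMemo;
end;
```

The same mapping can be entered at design time in the FormatOptions.MapRules collection editor of the TFDQuery or TFDConnection, which is presumably what the screenshot showed.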

lechonex