
I have a TFDConnection to a Firebird database, to which I apply data type mapping for backward compatibility with a previous data access technology (SQLDirect):

with FormatOptions.MapRules.Add do     // TIMESTAMP will be ftDateTime instead of ftTimeStamp
begin
   SourceDataType := dtDateTimeStamp;
   TargetDataType := dtDateTime;
end;
with FormatOptions.MapRules.Add do     // FLOAT will be ftFloat instead of ftSingle
begin
   SourceDataType := dtSingle;
   TargetDataType := dtDouble;
end;
FormatOptions.OwnMapRules := true;

At runtime I create a TFDQuery that I link to that TFDConnection.
I can see that it inherits the mapping rules: FormatOptions.MapRules.count=2

I assign an INSERT query to its SQL.Text:

insert into TT_ACT (TT_ACT_ID,TT_PARENT_ID,TT_FROMDATE,TT_TODATE,TT_NAME,TT_NR,TT_CODE,TT_GROUP...)
values (:TT_ACT_ID,:TT_PARENT_ID,:TT_FROMDATE,:TT_TODATE,:TT_NAME,:TT_NR,:TT_CODE,:TT_GROUP,...)

This gives me Params.Count = 42, with all parameters having DataType = ftUnknown (as expected).

I then call Prepare for the query.

If I now inspect a known datetime parameter, I see Params[x].DataType = ftTimeStamp, not ftDateTime. So when the query goes back to the database to determine the parameter types, it does not seem to honor the data mapping rules.
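A condensed repro of the steps described above (query/connection names and the shortened column list are illustrative, not the actual code):

```delphi
var
  Qry: TFDQuery;
begin
  Qry := TFDQuery.Create(nil);
  try
    Qry.Connection := FDConnection1; { inherits the 2 mapping rules }
    Qry.SQL.Text :=
      'insert into TT_ACT (TT_FROMDATE) values (:TT_FROMDATE)';
    Qry.Prepare;
    { observed: DataType is ftTimeStamp here, not ftDateTime }
    Assert(Qry.ParamByName('TT_FROMDATE').DataType = ftTimeStamp);
  finally
    Qry.Free;
  end;
end;
```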

Is this a bug?

At a later stage in my code this got me into trouble, resulting in the well-known -338 error:

[FireDac][Phys][IB]-338 Param [TT_FROMDATE] type changed from [ftSQLTimeStamp] to [ftDateTime]. Query must be reprepared. 

I managed to work around that error, so that is not part of the question. But I would expect the Params to follow the data type mapping rules as well; that would have made all this easier.

Jan Doggen

1 Answer


You just have the mapping rule definitions reversed. For parameters, the rule is a transformation of target into source. The Data Type Mapping topic says so as well:

In case of a command parameter, the rule defines a transformation of a target data type, specified by an application, into a source data type, supported by a driver.

So, to map command parameters from TIMESTAMP to dtDateTime and from FLOAT to dtDouble, just swap source with target in your definitions:

{ FLOAT → dtDouble in parameters }
with FormatOptions.MapRules.Add do
begin
  SourceDataType := dtDouble; { TFDParam.DataType }
  TargetDataType := dtSingle; { Firebird FLOAT }
end;
{ TIMESTAMP → dtDateTime in parameters }
with FormatOptions.MapRules.Add do
begin
  SourceDataType := dtDateTime; { TFDParam.DataType }
  TargetDataType := dtDateTimeStamp; { Firebird TIMESTAMP }
end;
{ enable custom map rules }
FormatOptions.OwnMapRules := True;
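As a side note, if the same connection should map TIMESTAMP both for resultset fields (as in the question) and for parameters (as above), each type pair needs one rule per direction. A sketch for the TIMESTAMP pair only:

```delphi
{ field direction: Firebird TIMESTAMP read as ftDateTime }
with FormatOptions.MapRules.Add do
begin
  SourceDataType := dtDateTimeStamp;
  TargetDataType := dtDateTime;
end;
{ parameter direction: ftDateTime parameters bound as TIMESTAMP }
with FormatOptions.MapRules.Add do
begin
  SourceDataType := dtDateTime;
  TargetDataType := dtDateTimeStamp;
end;
FormatOptions.OwnMapRules := True;
```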

It's worth adding that mapping rules for parameters do only one thing: they map parameter data types when the command is being prepared (at which point the data types must be determinable). They do not convert parameter values as they are passed to the driver. Consider this code:

{ Firebird FLOAT equals to dtSingle data type, map it to dtDouble }
with FDQuery1.FormatOptions.MapRules.Add do
begin
  SourceDataType := dtDouble;
  TargetDataType := dtSingle;
end;
FDQuery1.FormatOptions.OwnMapRules := True;
{ setup the command; MyFloat field is Firebird FLOAT }
FDQuery1.SQL.Text := 'INSERT INTO MyTable (MyFloat) VALUES (:MyFloat)';
{ rules are applied when preparing command, so let's prepare it }
FDQuery1.Prepare;
{ now the parameter data type should be ftFloat (dtDouble) instead of ftSingle }
if FDQuery1.ParamByName('MyFloat').DataType = ftFloat then
  ShowMessage('Parameter is of ftFloat data type');
{ but you can easily change the parameter data type to another, e.g. by mistake;
  this will change data type to dtSingle, so the whole mapping effort is lost }
FDQuery1.ParamByName('MyFloat').AsSingle := 1.2345;
{ if this executed (which it does not, because the parameter data type has
  been changed since the command was prepared), the parameter map rules
  would still not be involved in converting the parameter value for the driver }
FDQuery1.ExecSQL;

So as you can see, it's quite a lot of effort for very little (it only changes the determined parameter data type). Parameter values are converted automatically regardless of mapping rules. So even if your parameter data type doesn't match the DBMS data type but is convertible, FireDAC will simply convert it for you no matter what (this magic is inside the ConvertRawData method):

{ assume MyFloat FLOAT, MyTimeStamp TIMESTAMP in Firebird }
FDQuery1.SQL.Text := 'INSERT INTO MyTable (MyFloat, MyTimeStamp) VALUES (:MyFloat, :MyTimeStamp)';
{ deliberately set parameter data types to dtDouble and dtDateTime, not matching the DBMS types }
FDQuery1.ParamByName('MyFloat').AsFloat := 1.2345;
FDQuery1.ParamByName('MyTimeStamp').AsDateTime := Now;
{ and execute; parameter values will be converted automatically to DBMS data types
  dtDouble → dtSingle and dtDateTime → dtDateTimeStamp }
FDQuery1.ExecSQL;

So even here I would repeat that parameter collections should be defined manually rather than derived from the DBMS by a prepared command (the developer must know which values go into which fields).
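A minimal sketch of defining the parameter collection manually (table and column names are illustrative, not from the original code):

```delphi
{ set parameter data types explicitly instead of letting Prepare
  ask the DBMS; no mapping rules are needed then }
FDQuery1.SQL.Text :=
  'INSERT INTO MyTable (MyFloat, MyTimeStamp) VALUES (:MyFloat, :MyTimeStamp)';
FDQuery1.ParamByName('MyFloat').DataType := ftSingle;        { Firebird FLOAT }
FDQuery1.ParamByName('MyTimeStamp').DataType := ftTimeStamp; { Firebird TIMESTAMP }
FDQuery1.ParamByName('MyFloat').AsSingle := 1.2345;
FDQuery1.ParamByName('MyTimeStamp').AsSQLTimeStamp :=
  DateTimeToSQLTimeStamp(Now); { Data.SqlTimSt }
FDQuery1.ExecSQL;
```

Note that `AsSingle`/`AsSQLTimeStamp` are used so the value assignments don't silently change the data types that were just set (e.g. `AsDateTime` would switch the parameter back to ftDateTime).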

Victoria
  • O, wow, that's inconvenient. When *querying* data my mapping worked as expected (without the mapping rules a Firebird TIMESTAMP shows up as ftTimeStamp; with the rules as ftDateTime). If I understand you correctly: if I then want to map for the ptInput ParamType, I have to override the mapping for the query in the reverse direction. – Jan Doggen Sep 27 '17 at 07:27
  • 1
    But as you say here and [in your other answer](https://stackoverflow.com/a/46432053/512728) I will rewrite my code to build the parameter collection myself. – Jan Doggen Sep 27 '17 at 07:28
  • Yes, it worked because what you defined are map rules for resultset fields, while what I defined are map rules for parameters. It is a bit misleading, but still readable. For parameters, the source is the parameter and the target is the DBMS field; for a resultset field, the source is the DBMS field and the target is the resultset field. And if you want to use map rules for resultset fields as well as for parameters, you need to define 2 rules for each data type mapping (one for each direction). – Victoria Sep 27 '17 at 07:43
  • If you don't prepare the query before assigning the parameter values you don't need the mappings at all ;) –  Sep 27 '17 at 08:29
  • @AplikmUj, correct. Either you prepare a query to determine parameter data types (where mapping is applied), or you assign values by which you explicitly set up the parameter data types yourself. I recommended the latter. Final value conversions (when the command is executed) don't involve mapping rules (as one might expect); the rules are used for mapping parameter data types, not for value conversions. And parameter data types can be determined only if you prepare the command. – Victoria Sep 27 '17 at 09:36
  • @Victoria Agreed. In general, type mappings should not be required at all for queries that modify data. The mappings describe transitions from the database field type to a different FireDAC field type than the driver would assume. The original idea was simply a convenient way of converting results retrieved from database fields to types matching DB-related controls (implicit binding) - columns. I agree there has always been a strong tendency among programmers to regard those mappings as a kind of dictionary and an enabler for transformation at a meta-data level. –  Sep 27 '17 at 09:46
  • @Victoria What is missing in order to use mappings to provide a full-featured transformation is a tool AnyDAC was designed to interact with, which uses model files from ERD designers (in the middle). The bindings introduced by that framework are in a position to make use of the meta-data. Many mappings can be done by the driver very directly. Translating to a 'wrong' data type when writing makes no 'perfect' sense. Just wanted to add what you already know or are very likely aware of. Jan's problem is the prepare, and the question is whether mappings are the correct response in general. –  Sep 27 '17 at 09:54
  • @AplikmUj, yes, that mapping support for parameters seems to be useless. Parameter values are converted automatically without it. I find determining parameter data types useless in general. You may find some parameter of type, I don't know, `ftInteger`. Cool, what's next? Let's assign it a value. Accidentally take the `AsString` accessor and fill it with a string value (its data type changes), or assign it a string through the `Value` accessor (no data type change). In both cases "no problem" until command execution. So what's the point of having parameter data types determined? – Victoria Sep 27 '17 at 10:54
  • Am I supposed to write `if Param.DataType = ftInteger then Param.Value := 123 else raise Exception.Create('I''m so sorry, my application cannot execute this command because you, or your cat must have had f**k up the database somehow!');` if I'm about to assign an integer :)? No, I just need to assign an integer, so I simply say it so `Param.AsInteger := 123;` and the rest I leave on command execution. FireDAC is smart enough to convert such value to proper underlying DBMS data type (if the value is convertible). – Victoria Sep 27 '17 at 11:12
  • With properly setting up my parameters (as in [my other question](https://stackoverflow.com/questions/46428723/tfdquery-prepare-cannot-determine-parameter-types-for-insert-query-on-ms-sql-ser)) I do not need to redefine 'reverse' mapping rules for my Params. As Victoria wrote, the value assignments work fine now. I actually do `Param.DataType := TField.DataType`, then Prepare, then `Param.Value := TField.Value` assignments, where the TFields come from another run-time generated dataset (3rd party) which could have any field types. – Jan Doggen Sep 27 '17 at 15:04
  • Jan, for that you can assign values by the `Param.AssignFieldValue(MyField)` method and execute the command. `AssignFieldValue` will try to map data type for the parameter from `TField` and assign its value to the parameter. – Victoria Sep 27 '17 at 16:28
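That `AssignFieldValue` approach could be sketched like this (the dataset and query names are assumptions for illustration; it presumes each parameter name matches a source field name):

```delphi
{ copy data type and value from each source field to the parameter
  with the same name, then execute }
var
  I: Integer;
  Fld: TField;
begin
  for I := 0 to SrcDataSet.Fields.Count - 1 do
  begin
    Fld := SrcDataSet.Fields[I];
    FDQuery1.ParamByName(Fld.FieldName).AssignFieldValue(Fld);
  end;
  FDQuery1.ExecSQL;
end;
```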