I have an application which uses ADO.NET datasets and data adapters in combination with remoting (client/server architecture, transmitting datasets via remoting).
I now face the following issue:
TL;DR: the double value 44.850000925362000 turns into 44.850000925362004 after sending a dataset via remoting to the server.
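For context: those two decimal strings denote two *adjacent* doubles, exactly one ULP (unit in the last place) apart. A quick sketch (in Python, just to illustrate the IEEE 754 behavior independently of .NET):

```python
import math

a = float("44.850000925362000")   # value the client sends
b = float("44.850000925362004")   # value the server sees

# The two strings parse to two different doubles...
print(a == b)                          # False

# ...that sit right next to each other on the number line:
print(b == math.nextafter(a, math.inf))  # True
print(b - a == math.ulp(a))              # True: exactly 1 ULP apart
```

So whatever corrupts the value flips only the least significant bit of the mantissa - the classic signature of a lossy round trip through a decimal string with too few digits.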
I create a new row in the database by saving the dataset; the row contains a float column (mapped to double in the dataset). The double value saved is 44.850000925362.
I then read this row from the database (DataAdapter.Fill) and get back the same value (verified with BitConverter.DoubleToInt64Bits). The dataset passes via remoting to the client and is then merged into a usecase dataset there, still retaining the same value.
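That bit-level check can be mirrored outside .NET. Here is a rough Python analogue of BitConverter.DoubleToInt64Bits (the values are taken from the example above; the helper name is my own):

```python
import struct

def double_to_int64_bits(value: float) -> int:
    """Python analogue of .NET's BitConverter.DoubleToInt64Bits:
    reinterpret the 8 bytes of an IEEE 754 double as a signed 64-bit int."""
    return struct.unpack("<q", struct.pack("<d", value))[0]

saved = float("44.850000925362")   # value written to the database
read = float("44.850000925362")    # value returned by DataAdapter.Fill

# Identical bit patterns prove the save/read round trip is exact:
print(double_to_int64_bits(saved) == double_to_int64_bits(read))  # True

# The server-side value differs in exactly the last mantissa bit:
server = float("44.850000925362004")
print(double_to_int64_bits(server) - double_to_int64_bits(saved))  # 1
```

Comparing bit patterns instead of formatted strings is the reliable way to decide whether two doubles are really "the same value".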
The row from that usecase dataset is then imported into a different table (because it was read from a view but is saved to a table), and a value is changed before the usecase dataset (now containing a row in the other table) is transmitted back.
On the client side the value is still the same - but as soon as the dataset reaches the server, the value in question is different (although no changes were made to that specific column: its row state is still Unchanged, and even its original value differs).
Example:
Save: 44.850000925362000
Read: 44.850000925362000
Merge, import, modify row: still 44.850000925362000
Send to server for saving: 44.850000925362004 on the server!
...which then causes a ConcurrencyException, because the record was saved with 44.850000925362000 but the data adapter's update uses 44.850000925362004 in the WHERE condition (optimistic concurrency). Nobody else touched that row in between.
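The failure mode is easy to reproduce in miniature. A sketch using Python's sqlite3 as a stand-in for the real database (table and column names are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE weights (id INTEGER PRIMARY KEY, kg REAL)")
# Store the value exactly as the client originally saved it:
conn.execute("INSERT INTO weights VALUES (1, ?)", (float("44.850000925362"),))

# An optimistic-concurrency UPDATE puts the *original* column value into the
# WHERE clause. With the off-by-one-ULP value, no row matches:
cur = conn.execute(
    "UPDATE weights SET kg = ? WHERE id = ? AND kg = ?",
    (1.0, 1, float("44.850000925362004")),  # corrupted "original" value
)
print(cur.rowcount)  # 0 -> the adapter sees 0 affected rows and raises
                     #      its concurrency exception
```

Zero affected rows is exactly the signal the data adapter interprets as "someone else changed the record", hence the ConcurrencyException even though nobody did.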
Update
I tried setting up a test server and everything works fine there. Funny thing is: the same assembly works fine if I use it in a different service. I can't find anything in the config or startup that would explain this. I'm using a binary formatter on both, both are .NET 4.0, both use the same source code... but one behaves differently than the other.
Further update
I even captured the SQL statement being executed for the update. If I run its parameters as a WHERE clause in a SELECT statement, it fetches the correct record. So when I do this manually (via SQL Server Management Studio), a small delta between the value in the row and the value I supply in the condition is accepted. Still, the update doesn't work at all when run via the adapter.
Anyway, I've given up. I've resorted to rounding the value to 5 decimal places - way more precision than I need in that use case anyway. This might yield weird results if the number gets large, but I don't expect that here (we're talking about weight in kilograms).
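For completeness, the workaround in miniature (Python again, with the values from the example; the caveat in the comment is why this only works for small magnitudes):

```python
client = float("44.850000925362000")
server = float("44.850000925362004")

# Rounding to 5 decimal places collapses the one-ULP difference,
# so both sides agree again:
print(round(client, 5) == round(server, 5))  # True
print(round(client, 5))                      # 44.85

# Caveat: for very large magnitudes a double has fewer than 5 reliable
# decimal places, so rounding would no longer mask ULP-level noise.
```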