I'm getting huge differences when I pass a float from C# to C++. The float is dynamic and changes over time. With a debugger I get this:
c++ lonVel -0.036019072 float
c# lonVel -0.029392920 float
I set my MSVC++ 2010 floating point model to /fp:fast, which should be the standard in .NET if I'm not mistaken, but that didn't help.
I can't give out the full code, but I can show a fraction of it.
From the C# side it looks like this:
namespace Example
{
public class Wheel
{
public bool loging = true;
#region Members
public IntPtr nativeWheelObject;
#endregion Members
public Wheel()
{
this.nativeWheelObject = Sim.Dll_Wheel_Add();
return;
}
#region Wrapper methods
public void SetVelocity(float lonRoadVelocity, float latRoadVelocity) {
Sim.Dll_Wheel_SetVelocity(this.nativeWheelObject, lonRoadVelocity, latRoadVelocity);
}
#endregion Wrapper methods
}
internal class Sim
{
#region PInvokes
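// pluginName holds the native DLL's file name; its definition is part of the code I'm not showing here.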
[DllImport(pluginName, CallingConvention=CallingConvention.Cdecl)]
public static extern void Dll_Wheel_SetVelocity(IntPtr wheel,
float lonRoadVelocity, float latRoadVelocity);
#endregion PInvokes
}
}
And on the C++ side, in exportFunctions.cpp:
EXPORT_API void Dll_Wheel_SetVelocity(CarWheel* wheel, float lonRoadVelocity,
float latRoadVelocity) {
wheel->SetVelocity(lonRoadVelocity,latRoadVelocity);
}
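To rule out the possibility that the two debuggers are simply formatting the same value differently, I'm thinking of temporarily swapping the export for a version that logs the exact bit pattern it receives, and comparing that against BitConverter.GetBytes(lonRoadVelocity) on the C# side. A rough sketch (not my real code; EXPORT_API and CarWheel are as above):

#include <cstdint>
#include <cstdio>
#include <cstring>

// Debug-only variant: print the received floats with full precision plus their raw bits.
EXPORT_API void Dll_Wheel_SetVelocity(CarWheel* wheel, float lonRoadVelocity,
                                      float latRoadVelocity) {
    std::uint32_t lonBits, latBits;
    std::memcpy(&lonBits, &lonRoadVelocity, sizeof lonBits);  // bit-exact copy, no float conversion
    std::memcpy(&latBits, &latRoadVelocity, sizeof latBits);
    std::printf("lon = %.9g (0x%08X), lat = %.9g (0x%08X)\n",
                lonRoadVelocity, lonBits, latRoadVelocity, latBits);
    wheel->SetVelocity(lonRoadVelocity, latRoadVelocity);
}

If the hex pattern matches what C# passes in, the float itself crosses the boundary untouched and the difference must come from somewhere else.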
So, any suggestions on what I should do in order to get 1:1 results, or at least 99% correct results?