Can somebody help me with the following problem?

strFixFactorSide = _T("0.5");
dFixFactorSide = atof((const char *)(LPCTSTR)strFixFactorSide);

dFixFactorSide ends up with the value 0.0000. How can I get the correct value?
Use _tstof() instead of atof(), cast the CString to LPCTSTR, and leave it as such instead of trying to turn it into a const char *. Forget about const char * (LPCSTR) while you're working with Unicode and use only const _TCHAR * (LPCTSTR).
#include <afx.h>    // CString (MFC)
#include <tchar.h>  // _tmain, _tstof, _tprintf

int _tmain(int argc, TCHAR* argv[], TCHAR* envp[])
{
    int nRetCode = 0;
    CString s1 = _T("123.4");
    CString s2 = _T("567.8");
    // _tstof() maps to atof() in ANSI builds and to _wtof() in Unicode builds.
    double v1 = _tstof((LPCTSTR)s1);
    double v2 = _tstof((LPCTSTR)s2);
    _tprintf(_T("%.3f"), v1 + v2);
    return nRetCode;
}
Running this prints 691.200, the expected answer.
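Applied to the snippet from the question, the fix is a one-liner (assuming, as in the question, that strFixFactorSide is a CString and dFixFactorSide a double):

strFixFactorSide = _T("0.5");
dFixFactorSide = _tstof((LPCTSTR)strFixFactorSide);  // now 0.5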
I think your CString strFixFactorSide is a Unicode (UTF-16) string. If it is, the cast to (const char *) only changes the pointer type; the string it points to remains Unicode. atof() doesn't work with Unicode strings: if you feed it L"0.5", it fetches the bytes 0x30 ('0') and 0x00 (the high byte of the UTF-16 code unit for '0'), treats them as the NUL-terminated ASCII string "0", and converts that to 0.0.
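You can see that failure in isolation with a minimal standalone sketch (plain C++, no MFC needed; the byte layout in the comment assumes a little-endian UTF-16 wchar_t, as on Windows):

#include <cstdio>
#include <cstdlib>

int main()
{
    // In memory, L"0.5" is the byte sequence 30 00 2E 00 35 00 00 00.
    const wchar_t *wide = L"0.5";

    // The cast only reinterprets the pointer; atof() reads single bytes,
    // sees 0x30 ('0') followed by 0x00, and parses just "0".
    double v = atof((const char *)wide);

    printf("%f\n", v);  // prints 0.000000
    return 0;
}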
If CString strFixFactorSide is a Unicode string, you need to either convert it to an ASCII string first and then apply atof(), or use a function capable of converting Unicode strings to numbers directly; _wtof() is such a function for Unicode strings.
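Here is a sketch of both options (it assumes a Unicode MFC build; CT2A is ATL's string-conversion helper, and ConvertBoth is just an illustrative name):

#include <afx.h>      // CString (MFC)
#include <atlconv.h>  // CT2A conversion helper
#include <cstdlib>    // atof, _wtof

double ConvertBoth()
{
    CString strFixFactorSide = _T("0.5");

    // Option 1: convert the Unicode string to ANSI first, then apply atof().
    CT2A ansi(strFixFactorSide);
    double d1 = atof(ansi);

    // Option 2: use the wide-character counterpart of atof() directly
    // (in a Unicode build, CString converts implicitly to const wchar_t *).
    double d2 = _wtof(strFixFactorSide);

    return d1 + d2;  // d1 and d2 both hold 0.5
}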