I am using tinyobjloader to load Wavefront .obj files into my OpenGL application. I noticed an error when loading objects: when I load an object with a coordinate of e.g. 0.9999999, it is read as 0. By debugging I found that the following method produces this behaviour:
    static inline float parseFloat(const char*& token)
    {
        token += strspn(token, " \t");     // skip leading spaces/tabs
        float f = (float)atof(token);      // parse the number at 'token'
        token += strcspn(token, " \t\r");  // advance past the parsed token
        return f;
    }
So atof() somehow returns an int, not a float. I read that some compilers don't emit a warning when atof() is used without including <stdlib.h>; the function is then implicitly declared, and its return value is assumed to be an int.
The curious thing is that the error remains even when I include <stdlib.h>. I can't figure out what causes this behaviour.
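To rule out the surrounding parsing code, here is a minimal standalone check of atof() itself. Note that atof() is locale-dependent: it uses the decimal separator of the current LC_NUMERIC locale, so under a locale that uses ',' (e.g. German), "0.9999999" parses only the leading "0". The locale name "de_DE.UTF-8" below is just an example and may not be installed on every system:

    #include <cstdio>
    #include <cstdlib>
    #include <clocale>

    int main()
    {
        // Default "C" locale: '.' is the decimal separator, parsing works.
        std::printf("C locale:     %.7f\n", std::atof("0.9999999"));

        // Switch to a locale whose decimal separator is ',' (if available).
        // atof() then stops at the '.' and returns only the integer part.
        if (std::setlocale(LC_NUMERIC, "de_DE.UTF-8"))
            std::printf("de_DE locale: %.7f\n", std::atof("0.9999999"));
        else
            std::printf("de_DE locale not available on this system\n");
        return 0;
    }

If the second line prints 0.0000000, the problem is the process locale (perhaps set by another library or the GUI toolkit), not atof() itself.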
Any idea?