I'm writing a function that loads data from a file into memory so that I can use it within my program on demand. The size of the data file (in bytes) is read from a separate JSON file. This is the relevant code that causes the problem:
// buffer size in bytes, read from the JSON metadata
const uint32_t data_size = json["buffers"][0]["byteLength"].GetUint();
// raw buffer that holds the file contents
char* data = new char[data_size];
std::ifstream file_bin(path_bin, std::ios::binary);
file_bin.read(data, data_size);   // read the whole binary file into the buffer
file_bin.close();
After I'm done with the data, I delete it from memory using:
delete[] data;
Loading the data multiple times works without issue, and I have verified that there are no memory leaks. However, if data_size is large enough, the new char[data_size] allocation sometimes throws std::bad_alloc. The problem is persistent but seemingly random: sometimes I can load and delete the data tens of times before the exception is thrown, sometimes it happens as early as the fifth load.
The error is only thrown in the x86 build, never in the x64 one. So I did a reality check: I kept dynamically allocating int arrays, and the RAM usage got up to about 1.8 GB before the program crashed, which was to be expected for a 32-bit process. The issue is that the data size never exceeds even 1 GB. So why would it crash?
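A minimal sketch of that kind of reality check (not my exact test code; this version catches the exception and reports the total instead of just crashing):

#include <cstddef>
#include <iostream>
#include <new>
#include <vector>

// Keep allocating int arrays with new[] until it throws, then report how much
// memory was obtained before the failure.
int main()
{
    std::vector<int*> blocks;
    std::size_t total_bytes = 0;
    const std::size_t chunk = 1024 * 1024;        // ints per block, roughly 4 MB each

    try
    {
        for (;;)
        {
            blocks.push_back(new int[chunk]);     // same new[] path as the loader
            total_bytes += chunk * sizeof(int);
        }
    }
    catch (const std::bad_alloc&)
    {
        std::cout << "bad_alloc after ~" << total_bytes / (1024 * 1024) << " MB\n";
    }

    for (int* p : blocks)                         // free everything before exiting
        delete[] p;

    return 0;
}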
By the power of breakpoints I have also checked that data_size doesn't do anything funky: it is always a positive number and it never exceeds the available RAM (the program's total memory usage at that point is about 100 MB).
The issue can be worked around by catching the exception and then using the forbidden goto statement to try the allocation again; it always succeeds on the second try. However, this is less than ideal: it makes the code slightly less readable, and it can also cause an infinite loop if the requested allocation really is too large, in which case the program hangs without reporting any error.
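In simplified form, the workaround looks roughly like this (a sketch, not the code from my project verbatim; allocate_with_retry is just an illustrative name):

#include <cstddef>
#include <new>

// Sketch of the current workaround: retry the allocation after catching
// std::bad_alloc. The second attempt has always succeeded so far.
char* allocate_with_retry(std::size_t data_size)
{
    char* data = nullptr;
retry:
    try
    {
        data = new char[data_size];
    }
    catch (const std::bad_alloc&)
    {
        goto retry;   // unbounded retry: hangs forever if the request can never be satisfied
    }
    return data;
}

Replacing the goto with a bounded retry loop would at least avoid the potential hang, but it still wouldn't explain why the first attempt fails.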