I am using a multipart parser to send an HTTP POST request, and I need to send a big file (4 GB). However, when I test my code it throws a bad_alloc exception and crashes. I tried it with smaller files and it worked for files smaller than 500 MB, but the bigger the files get, the more often it crashes. Can you help me?

Here is the code where the crash occurs. It happens when it builds the body of the request:

std::ifstream ifile(file.second, std::ios::binary | std::ios::ate); //the file I am trying to send
std::streamsize size = ifile.tellg();
ifile.seekg(0, std::ios::beg);
char *buff = new char[size];
ifile.read(buff, size);
ifile.close();
std::string ret(buff, size); //this throws the bad_alloc exception
delete[] buff;
return ret; //return the body of the file in a string

Thanks

Botje
aabassi
  • You might want to print the value of size and catch exceptions that "new" might throw if it is out of memory. – U. W. Nov 24 '20 at 08:16
  • Oh, btw, std::streamsize is a signed int according to this: https://stackoverflow.com/questions/24482028/why-is-stdstreamsize-defined-as-signed-rather-than-unsigned. That would explain why values greater than 2GB pose a problem... – U. W. Nov 24 '20 at 08:19
  • You should send 4 GB in small chunks.... – Bernd Nov 24 '20 at 08:21
  • The real problem is that you're trying to read 4GB files into memory in the first place. See if your "multiparser" library (whatever that is) cannot be convinced to read the body of a request from a filehandle or callback instead (like curl does). If it absolutely MUST have a pointer to memory, use `mmap` or `MapViewOfFile` to map the file into your memory in an efficient manner. – Botje Nov 24 '20 at 08:28
  • In addition you should not allocate the data twice, `buff` and `ret` – Bernd Nov 24 '20 at 08:34
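A minimal sketch of the mmap approach suggested in the comments above, assuming a POSIX system (the MappedFile wrapper is illustrative, not part of any library mentioned here). The kernel pages the file in on demand, so even a 4 GB file does not require 4 GB of heap allocated up front:

```cpp
#include <sys/mman.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>
#include <cstddef>
#include <stdexcept>
#include <string>

// Map a whole file into memory read-only instead of copying it into a
// heap buffer. The pointer can then be handed to an API that insists
// on contiguous memory.
struct MappedFile {
    const char* data = nullptr;
    std::size_t size = 0;

    explicit MappedFile(const std::string& path) {
        int fd = ::open(path.c_str(), O_RDONLY);
        if (fd < 0) throw std::runtime_error("open failed");
        struct stat st {};
        if (::fstat(fd, &st) != 0) {
            ::close(fd);
            throw std::runtime_error("fstat failed");
        }
        size = static_cast<std::size_t>(st.st_size);
        void* p = ::mmap(nullptr, size, PROT_READ, MAP_PRIVATE, fd, 0);
        ::close(fd); // the mapping stays valid after closing the fd
        if (p == MAP_FAILED) throw std::runtime_error("mmap failed");
        data = static_cast<const char*>(p);
    }

    ~MappedFile() {
        if (data) ::munmap(const_cast<char*>(data), size);
    }

    MappedFile(const MappedFile&) = delete;
    MappedFile& operator=(const MappedFile&) = delete;
};
```

On Windows the equivalent would be `CreateFileMapping`/`MapViewOfFile`, as the comment notes.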

1 Answer


std::string ret(buff, size); creates a copy of buff, so you essentially double the memory consumption.

https://www.cplusplus.com/reference/string/string/string/

(5) from buffer

Copies the first n characters from the array of characters pointed by s.
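If you do want the whole file in one string, one way to avoid the doubled allocation is to read directly into the std::string. This is a sketch based on the code from the question, not the asker's exact code:

```cpp
#include <fstream>
#include <string>
#include <stdexcept>

// Read a whole file into a std::string with a single allocation,
// instead of new[] followed by a string copy (which needs roughly
// twice the file size in memory at once).
std::string readFile(const std::string& path) {
    std::ifstream ifile(path, std::ios::binary | std::ios::ate);
    if (!ifile) throw std::runtime_error("cannot open file");
    std::streamsize size = ifile.tellg();
    ifile.seekg(0, std::ios::beg);
    std::string ret(static_cast<std::size_t>(size), '\0'); // one allocation
    ifile.read(&ret[0], size);
    if (!ifile) throw std::runtime_error("read failed");
    return ret;
}
```

This still needs the full file size in memory once, so it only fixes the doubling, not the 4 GB allocation itself.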

Then the question becomes how much memory you actually have, and how much your OS allows a single process to allocate (e.g. ulimit on Linux).

As the comments say, you should chunk your read and send the file in multiple smaller POST requests.

You can loop over the ifstream, using the stream state after each read as the exit condition for your loop (checking only ifile.eof() is not a reliable exit criterion, and readsome may return 0 before the end of the file):

std::vector<char> buff(10 * 1024 * 1024); // read at most 10 MB per iteration
while (ifile.read(buff.data(), buff.size()) || ifile.gcount() > 0)
{
    std::streamsize readBytes = ifile.gcount();
    // do your sending of the first readBytes bytes of buff here.
}

You also need to consider error handling so that you do not leak buff or leave ifile open.
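Putting the loop and the error handling together, here is a self-contained sketch. The sendChunk callback is a hypothetical stand-in for whatever actually performs one POST upload; it is not part of the asker's library:

```cpp
#include <fstream>
#include <functional>
#include <stdexcept>
#include <string>
#include <vector>

// Stream a file in fixed-size chunks instead of loading it whole.
// sendChunk is a placeholder for the code that sends one chunk.
void sendInChunks(const std::string& path,
                  const std::function<void(const char*, std::streamsize)>& sendChunk,
                  std::size_t chunkSize = 10 * 1024 * 1024) {
    std::ifstream ifile(path, std::ios::binary);
    if (!ifile) throw std::runtime_error("cannot open file");
    std::vector<char> buff(chunkSize);
    // read() returns the stream; a short final read sets failbit but
    // still fills gcount(), so check both.
    while (ifile.read(buff.data(), buff.size()) || ifile.gcount() > 0) {
        sendChunk(buff.data(), ifile.gcount());
    }
    if (ifile.bad()) throw std::runtime_error("read failed");
} // buff and ifile are released automatically here, so nothing leaks
```

Peak memory stays at chunkSize regardless of how big the file is.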

Rene Oschmann