
I have a vector in C++ that I want to write to a .bin file. The vector's element type is byte, and the number of bytes could be huge, maybe millions. I am doing it like this:

if (depthQueue.empty())
    return;

FILE* pFiledep;

pFiledep = fopen("depth.bin", "wb");

if (pFiledep == NULL)
    return;

byte* depthbuff = (byte*) malloc(depthQueue.size() * 320 * 240 * sizeof(byte));

if(depthbuff)
{
  for(int m = 0; m < depthQueue.size(); m++)
  {
    byte b = depthQueue[m];
    depthbuff[m] = b;
  }

  fwrite(depthbuff, sizeof(byte),
        depthQueue.size() * 320 * 240 * sizeof(byte), pFiledep);
  fclose(pFiledep);
  free(depthbuff);
}

depthQueue is my vector which contains bytes; let's say its size is 100,000.
Sometimes I don't get any error, but the bin file is empty.
Sometimes I get a heap error.
Sometimes when I debug this, it seems that malloc doesn't allocate the space. Is the problem with the amount of space?

Or is the chunk of sequential memory so long that it can't be written to the bin file?


1 Answer


You hardly need any of that. A vector's contents are guaranteed to be contiguous in memory, so you can write from it directly:

fwrite(&depthQueue[0], sizeof (Byte), depthQueue.size(), pFiledep);

Note a possible bug in your code: if the vector is indeed vector<Byte>, then you should not be multiplying its size by 320*240.

EDIT: More fixes to the fwrite() call: The 2nd parameter already contains the sizeof (Byte) factor, so don't do that multiplication again in the 3rd parameter either (even though sizeof (Byte) is probably 1 so it doesn't matter).
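
Putting that together, a minimal sketch of the corrected write path (assuming a contiguous std::vector<BYTE>, where BYTE is the Windows typedef for unsigned char, and reusing the question's "depth.bin" name):

#include <cstdio>
#include <vector>
#include <windows.h>   // for the BYTE typedef; plain unsigned char works equally well

void writeDepth(const std::vector<BYTE>& depthQueue)
{
    if (depthQueue.empty())
        return;

    FILE* pFiledep = fopen("depth.bin", "wb");
    if (pFiledep == NULL)
        return;

    // The vector's storage is contiguous, so one call writes everything.
    // The count is the number of elements; no 320*240 factor and no extra sizeof.
    fwrite(&depthQueue[0], sizeof(BYTE), depthQueue.size(), pFiledep);
    fclose(pFiledep);
}

Checking the return value of fwrite() against depthQueue.size() is a cheap way to confirm the data actually made it to the file.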

  • `Concurrency::concurrent_vector depthQueue;` – user667222 Nov 18 '12 at 06:00
  • @user667222: I don't know what a `Concurrency::concurrent_vector` is -- you need to check its docs to see whether it provides the same contiguity guarantee. Also be consistent: is it `byte` or `Byte`? – j_random_hacker Nov 18 '12 at 06:02
  • @user667222: Given that you seem to be dealing with concurrency, if any other thread or process can write to the vector, you'll probably need to lock it before the `fwrite()` call and unlock it afterwards. – j_random_hacker Nov 18 '12 at 06:03
  • for now there are no concurrency issues, think of it as a normal vector<> – user667222 Nov 18 '12 at 06:06
  • @user667222: You can't just "think of it as normal vector" -- whether my code works depends on whether the type guarantees that all elements will be laid out contiguously in memory. It probably does, but *you need to check that*. – j_random_hacker Nov 18 '12 at 06:08
  • and the type is BYTE not byte or Byte – user667222 Nov 18 '12 at 06:09
  • I did the modification you said; there is no error BUT the created bin file is empty. Could it be because of the size? Writing millions of BYTEs? – user667222 Nov 18 '12 at 06:59
  • @user667222: Writing millions of bytes at a time should not be a problem. The one problem I can remember having was when trying to write blocks >64Kb to a network drive on Windows a few years ago, but that seems unlikely. Is the disk full? Are you calling `fclose()`? Try changing the type to `std::vector` and seeing if that works. – j_random_hacker Nov 18 '12 at 07:28
  • thanks for your help, here are the issues: I am writing the depth values from the Kinect to a bin file, so I have a thread which reads depth values for me. To prevent invalid allocations when reading/writing that depth data I am using Concurrency::concurrent_vector instead of the std::vector you just suggested. When I changed to std::vector I saw values in the bin file, but then, as I said, I got heap errors. When I use the concurrent vector the values are already in depthQueue and fwrite gives no error, but the bin file comes up empty!! Appreciate the help (I am doing fclose as you see in the code). – user667222 Nov 18 '12 at 09:49
  • @user667222: My guess is that the concurrent vector type doesn't support directly accessing the underlying memory -- as I said before, *check its documentation*. – j_random_hacker Nov 18 '12 at 10:22
  • ok thanks, it seems that the problem was with the .bin file. When I wrote a large amount of data the file came up empty, but when I write a small chunk it is ok. Before, I tried to store the depth of all frames at once in a single file; now I am creating an individual bin file for each frame and it is ok! I don't know the limitations of the bin file, maybe it has a maximum capacity or line length... – user667222 Nov 19 '12 at 23:00
  • @user667222: You *know* it's not the bin file, because it works when you use `std::vector`! The problem is either the limitations of the `concurrent_vector` type (READ ITS DOCS) or with your concurrency logic. – j_random_hacker Nov 19 '12 at 23:42
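
Following up on the contiguity question raised in the comments above: if Concurrency::concurrent_vector turns out not to guarantee contiguous storage, one workaround is to copy the elements into a std::vector<BYTE> first and write from that buffer. A sketch under that assumption (the function name is illustrative):

#include <cstdio>
#include <vector>
#include <concurrent_vector.h>
#include <windows.h>

void writeDepthSnapshot(const Concurrency::concurrent_vector<BYTE>& depthQueue)
{
    // Copy into a std::vector, whose storage is guaranteed to be contiguous.
    std::vector<BYTE> buffer(depthQueue.begin(), depthQueue.end());
    if (buffer.empty())
        return;

    FILE* pFiledep = fopen("depth.bin", "wb");
    if (pFiledep == NULL)
        return;

    fwrite(&buffer[0], sizeof(BYTE), buffer.size(), pFiledep);
    fclose(pFiledep);
}

The copy costs an extra pass over the data, but it sidesteps any assumptions about the concurrent container's internal layout.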