I'm working with a board with an ARM-based processor running Linux (3.0.35). The board has 1 GB of RAM and is connected to a fast SSD and to a 5 MP camera.
My goal is to capture high-resolution images and write them directly to disk.
All goes well until I try to save a very long video (over 1 GB of data). After saving such a large file, it seems that I'm unable to reload the camera driver: it fails to allocate a large enough DMA memory block for streaming (the call to dma_alloc_coherent() fails).
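For context, the allocation that fails looks roughly like the sketch below. This is my own simplified reconstruction, not the actual driver source; the 15 MB size and all the identifiers are assumptions:

```c
/* Simplified sketch of the driver's streaming-buffer allocation.
 * The 15 MB size and the names are assumptions, not the real driver code. */
#include <linux/dma-mapping.h>
#include <linux/device.h>
#include <linux/gfp.h>
#include <linux/errno.h>

#define STREAM_BUF_SIZE (15 * 1024 * 1024)   /* one contiguous buffer */

static void *stream_vaddr;
static dma_addr_t stream_dma;

static int camera_alloc_stream_buffer(struct device *dev)
{
	/* Requests a single physically contiguous, cache-coherent block.
	 * After the large file has been written, this returns NULL. */
	stream_vaddr = dma_alloc_coherent(dev, STREAM_BUF_SIZE,
					  &stream_dma, GFP_KERNEL);
	if (!stream_vaddr)
		return -ENOMEM;
	return 0;
}
```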
I narrowed it down to the following scenario: Linux boots (so most of the memory is available), I write random data into a large file (>1 GB), and then trying to load the camera driver fails.
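The "write random data" step is nothing special; it boils down to something like this (a rough sketch of the test, with the file path, chunk size, and total size chosen arbitrarily):

```c
/* Rough sketch of the reproduction step: write >1 GB of data to the SSD.
 * The file path, chunk size, and total size are arbitrary choices. */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

#define CHUNK (1 << 20)                         /* 1 MB per write()   */
#define TOTAL ((long long)1200 * (1 << 20))     /* roughly 1.2 GB     */

int main(void)
{
	char *buf = malloc(CHUNK);
	long long written = 0;
	int fd = open("/mnt/ssd/big_test_file",     /* hypothetical path */
		      O_WRONLY | O_CREAT | O_TRUNC, 0644);

	if (fd < 0 || !buf) {
		perror("setup");
		return 1;
	}
	memset(buf, 0xA5, CHUNK);                   /* contents don't matter */

	while (written < TOTAL) {
		if (write(fd, buf, CHUNK) != CHUNK) {
			perror("write");
			return 1;
		}
		written += CHUNK;
	}
	close(fd);          /* after this completes, loading the camera driver fails */
	free(buf);
	return 0;
}
```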
To my question -
When I open a file for writing, write a large amount of data, and close the file, isn't the memory that was used for writing the data to the disk supposed to be freed?
I can understand why the memory becomes fragmented during the disk access, but once the transactions to the disk have completed, why is the memory still so fragmented that I cannot allocate 15 MB of contiguous RAM?
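(For what it's worth, a quick way to see the kind of fragmentation I mean is the per-order free-block counts in /proc/buddyinfo; the tiny reader below, or simply reading that file from a shell, prints them.)

```c
/* Dump /proc/buddyinfo: each row lists, per memory zone, the number of
 * free blocks of order 0, 1, 2, ...; a large contiguous allocation needs
 * free blocks of high order to still be available. */
#include <stdio.h>

int main(void)
{
	char line[512];
	FILE *f = fopen("/proc/buddyinfo", "r");

	if (!f) {
		perror("/proc/buddyinfo");
		return 1;
	}
	while (fgets(line, sizeof(line), f))
		fputs(line, stdout);
	fclose(f);
	return 0;
}
```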
Thanks