Can anybody suggest ideas on how to solve this design problem?
I have an embedded system that generates events (of all sorts; let's abstract them for the sake of this question as data structures that can be serialised).
These events are normally sent directly to a server over an internet connection. When the connection is not available, the events must be backed up to a file, then later sent to the server in order of generation once the connection is available again. As a bonus, I would also like to maintain a history of events in the log file (capped at a certain number of days so the file size stays bounded).
The tools I have to use are an ARM Cortex-M4 micro, FatFS on an SD card (http://elm-chan.org/fsw/ff/00index_e.html), FreeRTOS, and gcc. All are set up and working.
A previous system of mine managed head and tail pointers into a block of EEPROM that acted as a FIFO queue. I am not sure how best to implement something similar using a file system.
So the problem is mostly about how to do this on a file system.
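For context, the kind of scheme I am imagining would translate the EEPROM head/tail idea into an append-only log file plus a tiny "cursor" file holding the byte offset of the next unsent record. The sketch below uses standard C stdio as a stand-in for FatFS (fopen/fwrite/fseek correspond roughly to f_open/f_write/f_lseek); the Event fields and the file names are made-up placeholders, not part of any real design yet.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical fixed-size event record (field names are assumptions). */
typedef struct {
    uint32_t timestamp;
    uint16_t status;
    uint16_t location;
} Event;

/* Append one event to the log file ("head" of the queue grows here).
   With FatFS this would be f_open(FA_OPEN_APPEND), f_write, f_sync. */
int log_append(const char *log_path, const Event *ev)
{
    FILE *f = fopen(log_path, "ab");
    if (!f) return -1;
    size_t n = fwrite(ev, sizeof *ev, 1, f);
    fclose(f);
    return n == 1 ? 0 : -1;
}

/* The "tail pointer": byte offset of the next unsent record,
   persisted in its own tiny file so it survives power loss. */
long cursor_load(const char *cur_path)
{
    long off = 0;
    FILE *f = fopen(cur_path, "rb");
    if (f) {
        if (fread(&off, sizeof off, 1, f) != 1) off = 0;
        fclose(f);
    }
    return off;
}

int cursor_store(const char *cur_path, long off)
{
    FILE *f = fopen(cur_path, "wb");
    if (!f) return -1;
    size_t n = fwrite(&off, sizeof off, 1, f);
    fclose(f);
    return n == 1 ? 0 : -1;
}

/* Drain unsent events in generation order; 'send' is supplied by the
   caller. The cursor is only advanced after a successful send, so a
   power loss re-sends at most one event rather than losing any. */
int log_drain(const char *log_path, const char *cur_path,
              int (*send)(const Event *))
{
    FILE *f = fopen(log_path, "rb");
    if (!f) return 0;                  /* nothing logged yet */
    long off = cursor_load(cur_path);
    fseek(f, off, SEEK_SET);
    Event ev;
    while (fread(&ev, sizeof ev, 1, f) == 1) {
        if (send(&ev) != 0) break;     /* connection dropped: retry later */
        off += (long)sizeof ev;
        cursor_store(cur_path, off);
    }
    fclose(f);
    return 0;
}
```

One open question with this approach is trimming: a single ever-growing file cannot easily drop old days, which is why I suspect the answer involves one file per day (or per N events) with the cursor naming both a file and an offset.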
Any advice appreciated. Thanks in advance.
Edit: There could be up to 10,000 events per day, and the device could be offline for up to 10 days. Events contain small amounts of data, such as a timestamp and a status value or location. The SD card has significantly more storage than the maximum buffered history requires: e.g. 10 MB would cover 10 days of storage, and the card will be at least 1 GB.
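To sanity-check those numbers, here is the worst-case backlog as a small helper; the 16-byte record size is purely an assumption (the real serialised size will differ):

```c
#include <stdint.h>

/* Worst-case backlog in bytes for a fixed-size record format.
   events_per_day and offline_days come from the question;
   record_bytes (16 below) is an assumed serialised size. */
static uint64_t worst_case_bytes(uint32_t events_per_day,
                                 uint32_t offline_days,
                                 uint32_t record_bytes)
{
    return (uint64_t)events_per_day * offline_days * record_bytes;
}
```

With these figures, 10,000 × 10 × 16 = 1,600,000 bytes, i.e. roughly 1.6 MB, so even tripling the record size stays comfortably inside the 10 MB budget quoted above.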
Ashley