I would like to know whether HDF5 is suitable for real-time data logging.
More precisely: I am working on a project in which we want to continuously log (at sampling rates ranging from 30 to 400 Hz) several hours' worth of data of different kinds (telemetry, signals, videos).
Data have to be written in real time (or with a small delay) so that we do not lose them in the event of a crash.
Our first prototype is based on sqlite3, but we suspect some limitations would arise in long-running use: speed, the one database == one file constraint, and difficulty accessing the database from several threads (lock exceptions when reading and writing at the same time).
So, I am considering using HDF5 as a back-end for data storage on disk (with NumPy/PyTables for the internal representation). Do you think it is possible to update an HDF5 file at regular intervals from these Python bindings?
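For context, here is a minimal sketch of the pattern I have in mind, using PyTables' extendable arrays (`EArray`): append a chunk of samples on each acquisition cycle and flush so the data reach disk promptly. The file and dataset names are made up for illustration.

```python
import os
import tempfile

import numpy as np
import tables

# Hypothetical file name, written to a temp directory for this sketch.
filename = os.path.join(tempfile.mkdtemp(), "telemetry.h5")

with tables.open_file(filename, mode="w") as h5:
    # An EArray's first dimension is unbounded (shape[0] == 0),
    # so new rows can be appended indefinitely.
    signal = h5.create_earray(
        h5.root, "signal",
        atom=tables.Float64Atom(),
        shape=(0, 3),             # 0 rows to start, 3 channels per sample
        expectedrows=400 * 3600,  # size hint: ~1 hour of data at 400 Hz
    )
    # Simulate two acquisition cycles: append a chunk, then flush so the
    # data are pushed to disk and survive a crash of the process.
    for _ in range(2):
        chunk = np.random.rand(100, 3)  # 100 new samples
        signal.append(chunk)
        h5.flush()

# Re-open read-only to confirm the rows were persisted.
with tables.open_file(filename, mode="r") as h5:
    print(h5.root.signal.nrows)  # 200
```

The open question for me is whether this append-and-flush cycle is robust enough at these rates, and how readers in other threads or processes should coordinate with the writer.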