I'm looking into using QuestDB for a large amount of financial trade data.
I have read and understood https://questdb.io/docs/guides/importing-data but my case is slightly different.
- I have trade data for multiple instruments.
- For each instrument, the microsecond-timestamped data spans several years.
- The data for each instrument is in a separate CSV file.
My main use case is to query for globally time-ordered sequences of trades over arbitrary subsets of instruments. For clarity, the result of such a query would look like:
timestamp,instrument,...
0001,A,...
0003,B,...
0004,C,...
0004,A,...
0005,B,...
Hence I believe I need to put all of the trades into a single table with a designated timestamp and an index on the instrument column. I don't necessarily need any of QuestDB's time-series processing features right now, but I do need the global sort by timestamp not to run out of memory.
My questions are (beyond whether this approach makes sense in general):
Is there an efficient (and easy) way to merge the data into a single table, given that each CSV file spans years?
Will I need to rebuild the table every time I add a new instrument in the future?
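On the first question, the approach I was considering (a sketch only, not yet tested against QuestDB): since each per-instrument file is already sorted by timestamp, a streaming k-way merge could produce one globally ordered CSV to import, without ever loading a whole file into memory. The file names, column names, and values below are made up for illustration:

```python
import csv
import heapq
import io

# Stand-ins for the real per-instrument CSV files; the columns,
# timestamps, and prices are illustrative. Each file is assumed to
# already be sorted by timestamp, as my data is.
files = {
    "A.csv": "timestamp,instrument,price\n0001,A,10.0\n0004,A,10.2\n",
    "B.csv": "timestamp,instrument,price\n0003,B,20.0\n0005,B,20.1\n",
    "C.csv": "timestamp,instrument,price\n0004,C,30.0\n",
}

def rows(text):
    # Stream dict rows from one CSV (in practice: open(path, newline="")).
    yield from csv.DictReader(io.StringIO(text))

# heapq.merge does a streaming k-way merge: it keeps only one pending
# row per input, so memory use is O(number of files), not O(total rows),
# even when each file spans years.
merged_rows = list(heapq.merge(*(rows(t) for t in files.values()),
                               key=lambda r: r["timestamp"]))

for r in merged_rows:
    print(r["timestamp"], r["instrument"], r["price"])
```

(The `list(...)` is only for this tiny demo; in practice I'd write the merged stream straight back out with `csv.DictWriter`, or feed it to QuestDB's CSV import, rather than materializing it.) But if QuestDB can merge timestamp-ordered imports into a partitioned table on its own, this preprocessing step may be unnecessary, which is really what I'm asking.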