I am storing IoT sensor data streaming from multiple sources into a few tables in QuestDB, and because of the huge volume of data I want to keep only the last 90 days. Their documentation shows a few ways of deleting data, but I don't want to truncate the table as I would lose the latest records. Is there any way to delete data older than 90 days without making a backup and truncating the tables?

djbobo
1 Answer
The strategy here is to create a partitioned table and then drop partitions based on a time value relative to now. The table must also have a designated timestamp. For example:
CREATE TABLE my_table (timestamp TIMESTAMP, x STRING) timestamp(timestamp) PARTITION BY DAY;
In your case, you want to drop partitions older than 90 days. You can do that with ALTER TABLE DROP PARTITION:
ALTER TABLE my_table DROP PARTITION
WHERE timestamp < dateadd('d', -90, now());
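You will typically want to run that statement on a schedule rather than by hand. As a minimal sketch, assuming QuestDB's REST /exec endpoint on the default port 9000 (the table name, retention window, and function names here are illustrative), a daily cron job could look like this:

```python
# Sketch of a scheduled retention job for QuestDB, assuming the HTTP
# /exec endpoint on the default port 9000. Table name and 90-day
# window are illustrative; adjust for your setup.
import urllib.parse
import urllib.request

QUESTDB_URL = "http://localhost:9000/exec"  # default REST endpoint


def retention_query(table: str, days: int = 90) -> str:
    """Build the DROP PARTITION statement for partitions older than `days`."""
    return (
        f"ALTER TABLE {table} DROP PARTITION "
        f"WHERE timestamp < dateadd('d', -{days}, now())"
    )


def apply_retention(table: str, days: int = 90) -> None:
    """Send the retention statement to QuestDB over HTTP."""
    params = urllib.parse.urlencode({"query": retention_query(table, days)})
    with urllib.request.urlopen(f"{QUESTDB_URL}?{params}") as resp:
        resp.read()  # urlopen raises on HTTP error statuses


# Example (requires a running QuestDB instance):
# apply_retention("my_table")
```

Run from cron (or any scheduler) once a day; dropping an already-absent partition range is simply a no-op for days where nothing has aged out.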
It might be useful to know that you can also partition by MONTH or YEAR.
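For lower-rate tables, a monthly-partitioned variant might look like this (table and column names are illustrative, and the three-month window is just an example):

```sql
CREATE TABLE my_monthly_table (timestamp TIMESTAMP, x STRING)
  timestamp(timestamp) PARTITION BY MONTH;

-- Dropping then removes whole months at a time:
ALTER TABLE my_monthly_table DROP PARTITION
WHERE timestamp < dateadd('M', -3, now());
```

Note that the partition unit is the granularity of deletion: with MONTH partitions you can only drop whole months, so DAY is the better fit for a rolling 90-day window.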
Beware: this is destructive, and QuestDB cannot recover data deleted in this way. Either ensure you have a proper backup method in place or be certain you no longer need the data. There are more details on the ALTER TABLE DROP PARTITION reference page.
Edit: there are now dedicated docs for this on the Data retention page.

Brian Smith