
My current problem is to track the last 250 actions of each user of an app. These need to be stored in some kind of database, with reasonably fast access and fairly fast writes.

Until now, I have simply saved every action of every user to a database, but its size is blowing up. Anything older than the newest 250 messages (per user) can be discarded, but SQL-based libraries do not provide a reasonable way to do this.
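To make this concrete, the only kind of pruning I have come up with is a periodic sweep roughly like the sketch below. The table and column names (actions, username, ts, payload) are placeholders for my actual schema, and psycopg2 is just the driver I happen to use:

    import psycopg2

    # Delete everything but the newest 250 rows per user.
    # This works, but it has to scan and rank the entire table on every run.
    PRUNE_SQL = """
        DELETE FROM actions a
        USING (
            SELECT ctid,
                   row_number() OVER (PARTITION BY username ORDER BY ts DESC) AS rn
            FROM actions
        ) ranked
        WHERE a.ctid = ranked.ctid
          AND ranked.rn > 250;
    """

    def prune(conn):
        with conn.cursor() as cur:
            cur.execute(PRUNE_SQL)
        conn.commit()

    if __name__ == "__main__":
        prune(psycopg2.connect("dbname=myapp"))

It keeps the table small, but every run ranks the whole table, which is why I do not consider it reasonable at my write rates.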

Ideally, the database system would use circular buffers of non-fixed length. Pre-allocating all the required space is not an option: over 90% of users only ever perform fewer than 100 actions, so pre-allocated buffers would waste far too much memory. For the same reason, the size of each entry needs to be dynamic, since reserving the maximum message length for every message would leave a lot of empty space.

I was considering writing such a database system myself, using small (256-byte), equally-sized chunks of data linked together, and keeping track of the empty ones. Obviously I would prefer a ready-made solution, since writing my own database system is time-consuming and error-prone.
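To illustrate the idea, here is a rough, purely in-memory sketch of that chunk layout. Everything about it (the 6-byte header, the bytearray backing, the names) is hypothetical; a real version would be file- or mmap-backed:

    import struct

    CHUNK_SIZE = 256
    HEADER = struct.Struct("<iH")           # next-chunk index (-1 = end), used bytes
    PAYLOAD = CHUNK_SIZE - HEADER.size      # 250 usable bytes per chunk

    class ChunkStore:
        def __init__(self):
            self.buf = bytearray()          # pool of equally sized chunks
            self.free = []                  # indices of chunks that can be reused

        def _alloc(self):
            if self.free:
                return self.free.pop()
            self.buf.extend(b"\x00" * CHUNK_SIZE)
            return len(self.buf) // CHUNK_SIZE - 1

        def write(self, data: bytes) -> int:
            """Store a record across linked chunks; return the first chunk's index."""
            pieces = [data[i:i + PAYLOAD] for i in range(0, len(data), PAYLOAD)] or [b""]
            indices = [self._alloc() for _ in pieces]
            for pos, (idx, piece) in enumerate(zip(indices, pieces)):
                nxt = indices[pos + 1] if pos + 1 < len(indices) else -1
                off = idx * CHUNK_SIZE
                self.buf[off:off + HEADER.size] = HEADER.pack(nxt, len(piece))
                self.buf[off + HEADER.size:off + HEADER.size + len(piece)] = piece
            return indices[0]

        def read(self, idx: int) -> bytes:
            """Follow the chunk links and reassemble the record."""
            out = bytearray()
            while idx != -1:
                off = idx * CHUNK_SIZE
                idx, used = HEADER.unpack_from(self.buf, off)
                out += self.buf[off + HEADER.size:off + HEADER.size + used]
            return bytes(out)

        def delete(self, idx: int):
            """Mark every chunk of a record as reusable."""
            while idx != -1:
                nxt, _ = HEADER.unpack_from(self.buf, idx * CHUNK_SIZE)
                self.free.append(idx)
                idx = nxt

Each record would then be addressed by the index of its first chunk, and a user's "last 250 actions" buffer would just be a ring of 250 such indices plus a write cursor, deleting whichever record it overwrites.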

Is there any system that does, more or less, what I intend to do? If not, what is the best approach to writing such a database system myself?

CBenni
  • What kind of actions can the users make? What is the current structure of your database? What RDBMS are you working with? – Zohar Peled Jun 13 '15 at 08:19
  • @ZoharPeled The current system is PostgreSQL, and the data is basically just time-series data: a timestamp, a username and a payload (a varchar). – CBenni Jun 13 '15 at 11:12
  • I have no experience with PostgreSQL, but from looking at the documentation you can use an after-insert trigger on the table and just delete the records for the user where there are at least 250 records for that user after them... I can give a code example in SQL Server if that helps. – Zohar Peled Jun 13 '15 at 11:21
  • @ZoharPeled I tried a code snippet I found for PostgreSQL, but it could not manage 100+ actions per second, or anything close to that. If there is an efficient method, however, I would love to see it. – CBenni Jun 13 '15 at 13:05
  • As I said, I have no experience with PostgreSQL, so I really can't answer that. Perhaps you should try to find a way to schedule execution of a stored procedure that will clean up old records. (It's fairly easy in SQL Server.) – Zohar Peled Jun 14 '15 at 06:52

0 Answers