I want to store information about requests to my sites in a quick way that doesn't put additional strain on my database. The goal is to use this information to prevent abuse and to gather information about how users interact with the site (IP, GET/POST, url/action, timestamp).
I'm currently saving a new row to the database on each page request. However, this wastes resources on an extra database call, when nginx is already logging the same information to its access log.
I want to know how to handle this better. I have two ideas, but I'd also like to know if there are better methods:
- CRON job to parse the access log each day and save the entries to the database as one batch transaction.
- RAM cache (redis/memcached) to store data about each request, then a CRON job to flush it to the database.
However, if I use a key-value cache, I'm not sure how to store the data in a way that lets me retrieve all the records later and insert them into the database.
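To make the question concrete, here is a rough sketch of what I imagine for the cache approach, in Python with the redis-py client (the key name, table, and columns are placeholders, and I've assumed a sqlite3-style DB-API connection). Each request does a single O(1) RPUSH, and a cron job later drains the list in batches:

```python
import json
import time

import redis

r = redis.Redis(host="localhost", port=6379, db=0)

BUFFER_KEY = "request_log_buffer"  # placeholder key name


def log_request(ip, method, url):
    """Called on every page request: one O(1) Redis push, no database hit."""
    entry = {"ip": ip, "method": method, "url": url, "ts": int(time.time())}
    r.rpush(BUFFER_KEY, json.dumps(entry))


def drain_to_db(conn, batch_size=1000):
    """Run from cron: pop buffered entries and batch-insert them."""
    while True:
        # LPOP with a count pops up to batch_size entries at once
        # (needs Redis >= 6.2); returns None when the list is empty.
        raw = r.lpop(BUFFER_KEY, batch_size)
        if not raw:
            break
        entries = [json.loads(item) for item in raw]
        with conn:  # one transaction per batch
            conn.executemany(
                "INSERT INTO request_log (ip, method, url, ts) VALUES (?, ?, ?, ?)",
                [(e["ip"], e["method"], e["url"], e["ts"]) for e in entries],
            )
```

Is a plain list the right structure for this, or is there a better way to lay the data out in Redis?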
I also don't know how to parse the access log in a way that avoids re-reading entries I've already processed.
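The only way I can think of is to persist the byte offset of the last read and seek to it on the next run. A minimal sketch, assuming the standard nginx access log path and an offset file of my own choosing (rotation is only handled naively, by starting over when the file shrinks):

```python
import os

ACCESS_LOG = "/var/log/nginx/access.log"    # assumed log location
OFFSET_FILE = "/var/tmp/access_log.offset"  # stores the last-read byte position


def read_new_lines():
    """Return only the log lines appended since the previous run."""
    try:
        with open(OFFSET_FILE) as f:
            offset = int(f.read().strip() or 0)
    except FileNotFoundError:
        offset = 0

    # If the log is now smaller than our offset, it was probably rotated,
    # so start again from the beginning of the new file.
    if os.path.getsize(ACCESS_LOG) < offset:
        offset = 0

    with open(ACCESS_LOG) as log:
        log.seek(offset)
        lines = log.readlines()
        offset = log.tell()

    with open(OFFSET_FILE, "w") as f:
        f.write(str(offset))
    return lines
```

Would an offset file like this be reliable enough in practice, or does log rotation make the approach too fragile?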
How can I record these access attempts efficiently?