I would like to keep a log in my application for each database change event: something like timestamp + username + which model was changed, written to a log file (just log that a model instance was modified, not what the change was). A combination of signals and logging seems like a natural fit.
So in the models' signals I would do:
import logging
...
logger = logging.getLogger(__name__)
logger.info("Order {} deleted by {}".format(order.barcode, request.user.email))
The problem with this is that the INFO level of the log is full of POST and GET requests. The usual recommendation, if you don't want all those request logs in your log file, is to move up to a higher level like WARNING, but it doesn't make sense to label these events as warnings here. Another option is to use a filter to get rid of the GET/POST logs, as recommended here. But by default WARNING and ERROR level events also end up in an INFO-level log, which is also not desired behaviour in this case, so those would have to be filtered out as well (see the sketch below). This log file is really just meant to record 'this user changed this model instance' events.
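For reference, the exact-level filter described above might look something like this (a sketch; the class, handler name, logger name, and file path are all placeholders, not anything Django ships):

import logging

class InfoOnlyFilter(logging.Filter):
    # Pass only INFO records, so WARNING/ERROR events from the same
    # loggers don't leak into the audit file.
    def filter(self, record):
        return record.levelno == logging.INFO

wired into settings.LOGGING:

LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "filters": {
        "info_only": {"()": "myapp.log_filters.InfoOnlyFilter"},  # assumed module path
    },
    "handlers": {
        "audit_file": {
            "class": "logging.FileHandler",
            "filename": "/var/log/myapp/audit.log",  # placeholder path
            "level": "INFO",
            "filters": ["info_only"],
        },
    },
    "loggers": {
        "myapp": {  # whatever __name__ resolves to where the signals live
            "handlers": ["audit_file"],
            "level": "INFO",
        },
    },
}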
Also, ideally this file would be accessible (read-only) on site by staff-level users, so I was thinking of putting it in the MEDIA dir. If the log also contained error traces, that would probably be a bad idea.
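To make the read-only requirement concrete, one alternative to MEDIA would be serving the file through a staff-gated view (a sketch; the view name and log path are placeholders):

from django.contrib.admin.views.decorators import staff_member_required
from django.http import HttpResponse

@staff_member_required
def audit_log(request):
    # Served through a view rather than MEDIA, so the file is never
    # publicly reachable even if error traces do end up in it.
    with open("/var/log/myapp/audit.log") as f:
        return HttpResponse(f.read(), content_type="text/plain")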
It just seems like a hackish solution. What am I overlooking here?