
One of my Django websites was recently found by some crawler that went nuts over the login page, requesting it 100-200 times per minute for a couple of days before I noticed. I only found out because my database backups started exploding in size, specifically because of the django_session table. I have django-admin.py clearsessions set to run daily to clean out expired sessions, but the millions of sessions the crawler created aren't expired yet, so the command doesn't touch them. The crawler didn't actually try to log in, but every request to the login page still created a new session row.
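If I understand the machinery correctly, the problem is that merely touching the session marks it as modified, and SessionMiddleware then writes a django_session row even for anonymous visitors; I believe the stock login view does exactly that via set_test_cookie(). A stripped-down, hypothetical view showing the effect (not my actual code):

```python
# views.py -- hypothetical minimal example (not my real login view) of why
# anonymous hits create sessions.
from django.http import HttpResponse


def crawler_bait(request):
    # The stock login view calls set_test_cookie(); writing anything to the
    # session has the same effect of flagging it as modified.
    request.session.set_test_cookie()
    # SessionMiddleware sees session.modified == True in process_response and
    # inserts a new django_session row, even though nobody logged in.
    return HttpResponse("login form would go here")
```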

I ended up blocking the bot, adding rel='nofollow' to the login links, and adding Disallow: /login/ to robots.txt, but in the long term anything else could still come along and fill my database with garbage. How do I avoid this? I don't see why I need sessions for users who aren't logged in at all; can I restrict session creation to authenticated users?
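What I'm imagining is a drop-in replacement for SessionMiddleware that throws away whatever an anonymous request wrote to the session before it gets saved. A rough, untested sketch (the class name is mine; assumes Django 1.x-style MIDDLEWARE_CLASSES, where is_authenticated is still a method):

```python
# middleware.py -- untested sketch; swap this in for
# 'django.contrib.sessions.middleware.SessionMiddleware' in MIDDLEWARE_CLASSES.
from django.contrib.sessions.middleware import SessionMiddleware


class AuthenticatedOnlySessionMiddleware(SessionMiddleware):
    """Only persist sessions for logged-in users."""

    def process_response(self, request, response):
        user = getattr(request, 'user', None)
        # is_authenticated() is a method on Django 1.x; it becomes a property
        # in later releases.
        if hasattr(request, 'session') and (user is None or not user.is_authenticated()):
            # Pretend the session was never modified so the parent class
            # doesn't write a django_session row or set a sessionid cookie
            # for this anonymous request.
            request.session.modified = False
        return super(AuthenticatedOnlySessionMiddleware, self).process_response(
            request, response)
```

I haven't verified that this plays nicely with the test-cookie dance during login itself, so it's a starting point rather than something I know works.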

My session middleware is django.contrib.sessions.middleware.SessionMiddleware and I haven't specified SESSION_ENGINE (so it's the default, I presume).
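For reference, these are what I believe the relevant defaults to be (none of them are overridden in my settings.py):

```python
# settings.py equivalents -- my understanding of the Django defaults,
# not values I've set myself.
SESSION_ENGINE = 'django.contrib.sessions.backends.db'  # one row per session in django_session
SESSION_COOKIE_AGE = 1209600         # two weeks, so the junk sessions won't expire for 14 days
SESSION_SAVE_EVERY_REQUEST = False   # sessions are only written when they're modified
```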

Nick T
  • the `clearsessions` command only removes expired sessions. You could look at shortening your session timeout, but really, I wouldn't worry about it. – warath-coder Feb 12 '15 at 00:15
  • @warath-coder the millions of extra sessions cause my backups to explode to 100x their previous size and make JSON backups impossible (the small webserver kills the process because it runs out of memory). – Nick T Feb 12 '15 at 00:29
  • try the shorter timeout; or do what I've done: track which sessions are tied to users who are logged in, then delete all the other sessions prior to a backup (or just don't back up the session table; you don't really need it for disaster recovery). – warath-coder Feb 12 '15 at 00:35
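Following up on the pruning idea from the last comment, something like this run before the backup is what I have in mind (untested sketch; the function name is mine):

```python
# prune_sessions.py -- untested sketch of the "delete everything not tied to a
# logged-in user" idea; purge_anonymous_sessions is just an illustrative name.
from django.contrib.sessions.models import Session


def purge_anonymous_sessions():
    deleted = 0
    # Row-by-row, so slow for millions of sessions, but fine as a sketch.
    for session in Session.objects.iterator():
        # Sessions for authenticated users carry an _auth_user_id key in
        # their encoded payload; anything without one is anonymous.
        if '_auth_user_id' not in session.get_decoded():
            session.delete()
            deleted += 1
    return deleted
```

Alternatively, if the JSON backups are made with dumpdata, the backup itself can just skip the table, e.g. `./manage.py dumpdata --exclude=sessions > backup.json`.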

0 Answers