I am having a problem with my django_q_task table. It is saving many more records than expected.

I have config like below:

Q_CLUSTER = {
    'name': 'xxxx',
    'workers': 8,
    'recycle': 500,
    'timeout': 600,
    'compress': True,
    'save_limit': 250,
    'queue_limit': 500,
    'cpu_affinity': 1,
    'label': 'Django Q',
    ...
}

With save_limit set to 250, I still have more than 1M records in the django_q_task table.

Does anyone know why it saves so many records? This is causing me memory issues.
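To see what is actually filling the table, the breakdown can be checked from `./manage.py shell`. This is just a diagnostic sketch, assuming django-q's bundled Task, Success and Failure proxy models in django_q.models:

from django_q.models import Task, Success, Failure

# Success/Failure are proxy models over django_q_task filtered on the
# success flag; save_limit only ever prunes the successful rows.
print("total:    ", Task.objects.count())
print("successes:", Success.objects.count())
print("failures: ", Failure.objects.count())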

Best regards, Thanh Tran

  • That config option "Limits the amount of successful tasks saved", so what exactly is it that your task does? Perhaps the task does multiple saves and the task runs multiple times given that the limit on tasks is 250? – markwalker_ Jul 13 '21 at 10:09
  • Hi. Thank you for your help! My task is just saving a record to the database – van thanh tran Jul 14 '21 at 06:44
  • Can you try running `./manage.py dbshell` and the SQL query `select success, count(*) from django_q_task group by success;` ? This will tell us how many are successes and how many are failures. – Nick ODell Sep 25 '21 at 18:00
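
Following up on the comments: since save_limit only trims successful tasks, failed rows can accumulate without bound and have to be cleaned up separately. A minimal cleanup sketch, assuming django-q's Failure proxy model and its stopped timestamp field (the 30-day cutoff is an arbitrary example value):

from datetime import timedelta
from django.utils import timezone
from django_q.models import Failure

# Delete failed task rows older than the cutoff; adjust to taste.
cutoff = timezone.now() - timedelta(days=30)
deleted, _ = Failure.objects.filter(stopped__lt=cutoff).delete()
print("removed", deleted, "rows")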

0 Answers