
How can I enqueue a function that will run for a long time?

I want to do the following:

    def batch_insert(data):
        rows.append(MyModel(*data))
        if len(rows) > 1000:
            MyModel.objects.bulk_create(rows)

user3599803

1 Answer

  1. Make sure you have the django-rq app installed and registered in your project's settings.py. You will also need the following setting:

    RQ_QUEUES = {
        "default" : { "USE_REDIS_CACHE" : "jobs" },
    }
    

    and the following added to your CACHES setting:

    CACHES = {
        ...
        "jobs": {
            "BACKEND"  : "django_redis.cache.RedisCache",
            "LOCATION" : "{{YOUR REDIS SERVER ADDRESS}}",
            "OPTIONS"  : {
                "CLIENT_CLASS": "django_redis.client.DefaultClient",
            },
        },
    }
    
  2. Create a jobs.py file in your app, with the job you'd like to enqueue:

    from myapp.models import MyModel
    from django_rq    import job
    
    @job
    def batch_insert(data):
        # Expects `data` to be a list of argument tuples, one per row
        rows = [MyModel(*item) for item in data]
        if len(rows) > 1000:
            # Large batches go through a single bulk INSERT
            MyModel.objects.bulk_create(rows)
        else:
            for row in rows:
                row.save()
    
  3. Import your job into a view that triggers it:

    from django.http import HttpResponse
    
    from myapp.jobs import batch_insert
    
    def trigger_batch_insert(request):
        sample_data = ...  # Define your data here
        batch_insert.delay(sample_data)  # Enqueues the job on a worker,
                                         # instead of running it synchronously
        return HttpResponse("Job running!")
    
  4. Make sure you hook up the view to a URL route in your urls.py
  5. Make sure your RQ workers are running:

    $ python manage.py rqworker default
    
  6. Send the view a request, and check the console running the RQ workers to see if it worked :)
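For step 4, the URL wiring might look like the following — a minimal sketch, assuming the view lives in `myapp/views.py` and that you want the route at `batch-insert/` (both are assumptions, not part of the question):

```python
# myapp/urls.py (or your project's root urls.py)
from django.urls import path

from myapp.views import trigger_batch_insert

urlpatterns = [
    # Requests to /batch-insert/ enqueue the batch_insert job
    path("batch-insert/", trigger_batch_insert, name="batch_insert"),
]
```

With the worker running, you can then trigger the job from a shell with something like `curl http://localhost:8000/batch-insert/` and watch the rqworker console for output.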

Julien