I have a Django REST framework app that calls two huey tasks in succession in a serializer's create method, like so:

...
def create(self, validated_data):
    user = self.context['request'].user
    player_ids = validated_data.get('players', [])
    game = Game.objects.create()

    tasks.make_players_friends_task(player_ids)
    tasks.send_notification_task(user.id, game.id)
    return game

# tasks.py
@db_task()
def make_players_friends_task(ids):
    players = User.objects.filter(id__in=ids)
    # process players

@db_task()
def send_notification_task(user_id, game_id):
    user = User.objects.get(id=user_id)
    game = Game.objects.get(id=game_id)
    # send notifications

When I run the huey process in the terminal and hit this endpoint, I can see that only one or the other of the tasks is ever called, but never both. I am running huey with the default settings (redis with one worker thread).
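For reference, here is a minimal sketch of what that configuration would look like if spelled out explicitly in settings.py, assuming huey's Django integration (huey.contrib.djhuey) is in use; the exact values are illustrative:

# settings.py (sketch) -- roughly the defaults described above
HUEY = {
    'huey_class': 'huey.RedisHuey',      # redis-backed queue
    'immediate': False,                  # actually enqueue, don't run inline
    'connection': {'host': 'localhost', 'port': 6379},
    'consumer': {
        'workers': 1,                    # single worker, as in the question
        'worker_type': 'thread',
    },
}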

If I alter the code so that I am passing in the objects themselves as parameters, rather than the ids, and remove the django queries in the @db_task methods, things seem to work alright.
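For clarity, that working variant looks roughly like this (a sketch; the objects are fetched in create() and pickled into the task payload):

# create() -- pass the ORM instances themselves
players = list(User.objects.filter(id__in=player_ids))
tasks.make_players_friends_task(players)
tasks.send_notification_task(user, game)

# tasks.py
@db_task()
def make_players_friends_task(players):
    # players arrive as unpickled User instances; no query needed
    ...

@db_task()
def send_notification_task(user, game):
    # likewise, user and game are already model instances
    ...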

The reason I initially used the ids as parameters is because I assumed (or read somewhere) that huey uses json serialization as default, but after looking into it, pickle is actually the default serializer.

One theory is that since I am only running one worker, and also have a @db_periodic_task method in the app, the process can only listen for tasks or execute them at any given time, but not both. That is how celery seems to work, where the scheduler and the worker each need a separate process, but this isn't mentioned in huey's documentation.
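For what it's worth, a minimal sketch of how such a periodic task is declared with huey's Django helpers (the schedule and name here are made up):

# tasks.py (sketch)
from huey import crontab
from huey.contrib.djhuey import db_periodic_task

@db_periodic_task(crontab(minute='*/5'))
def my_periodic_task():
    # handled by the same consumer process that runs the enqueued tasks
    ...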

Cameron Sima

1 Answer


If you run the huey consumer, it actually spawns a separate scheduler alongside the number of workers you've specified, so that is not going to be your problem.

You're not giving enough information to properly see what's going wrong, so check the following:

  • When you run the huey consumer in the terminal, check whether all of your tasks show up as properly registered, so you know the consumer is actually able to consume them.
  • Check whether your redis process is running.
  • Try calling the tasks with a blocking get() to see which one fails:

    # enqueue the first task and block until its result is available
    task_result = tasks.make_players_friends_task(player_ids)
    task_result.get(blocking=True)

    # then the second; if this call never returns, the task never ran
    task_result = tasks.send_notification_task(user.id, game.id)
    task_result.get(blocking=True)
    

    Do this with a debugger or print statements to see whether it makes it to the end of your function or where it gets stuck.

  • Make sure to always restart your consumer when you change code: it doesn't automatically pick up new code the way the Django dev server does. The fact that your code works as intended when pickling whole objects instead of passing ids could point to this, since it would be very odd for that change alone to break things. That said, you shouldn't pass Django ORM objects into tasks; your id-based approach makes much more sense (a rough sketch follows below).
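As a rough sketch of that id-based approach, with a defensive lookup so a missing row doesn't raise inside the worker (names taken from the question; the guard is just illustrative):

# tasks.py (sketch)
@db_task()
def send_notification_task(user_id, game_id):
    user = User.objects.filter(id=user_id).first()
    game = Game.objects.filter(id=game_id).first()
    if user is None or game is None:
        # row not visible (yet) or deleted; bail out instead of raising DoesNotExist
        return
    # send notifications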

Glenn D.J.
  • I'm restarting everything on code change with a bash script (gunicorn and huey), so I don't think that's it. I've seen code errors thrown in the terminal running the huey process before, and that's not the case here. The weird thing is that one or the other is called successfully, seemingly at random on each request, which rules out an error being thrown in my code – Cameron Sima Apr 03 '20 at 14:29
  • Did you try the code where it blocks until it receives a result? I think that will help you really pinpoint what's going wrong. It even sounds like you could be unintentionally running 2 consumers. – Glenn D.J. Apr 06 '20 at 16:46