
Hi, I've been reading a lot about this on these forums, but I still have no idea what's going wrong. Everything looks OK, but it just doesn't work.

I set up my local configuration like this (/etc/default/celeryd):

# or we could have three nodes:
#CELERYD_NODES="w1 w2 w3"

# Absolute or relative path to the 'celery' command:
#CELERY_BIN="/usr/local/bin/celery"
CELERY_BIN="/home/ubuntu/.virtualenvs/wlenv/bin/celery"

# Where to chdir at start.
CELERYD_CHDIR="/var/www/DIR_TO_MANAGE.PY_FOLDER"

# Python interpreter from environment.
ENV_PYTHON="/home/ubuntu/.virtualenvs/wlenv/bin/python"
#ENV_PYTHON="/usr/bin/python2.7"

# Name of the projects settings module.
export DJANGO_SETTINGS_MODULE="sec.settings"

# How to call "manage.py celeryd_multi"
CELERYD_MULTI="$CELERYD_CHDIR/manage.py celeryd_multi"

# Extra arguments to celeryd
CELERYD_OPTS="--time-limit 300 --concurrency=8"

# Name of the celery config module.
CELERY_CONFIG_MODULE="celeryconfig"

# %n will be replaced with the nodename.
CELERYD_LOG_FILE="/logs/celery/log/%n.log"
CELERYD_PID_FILE="/logs/celery/run/%n.pid"

# Workers should run as an unprivileged user.
CELERYD_USER="ubuntu"
CELERYD_GROUP="ubuntu"

# If enabled pid and log directories will be created if missing,
# and owned by the userid/group configured.
CELERY_CREATE_DIRS=1

When I run `/etc/init.d/celeryd start` I get this:

celeryd-multi v3.0.9 (Chiastic Slide)
> Starting nodes...
    > celery.ip-10-51-179-42: OK
    > 300.ip-10-51-179-42: OK

But the workers are not running (`/etc/init.d/celeryd status`):

Error: No nodes replied within time constraint.

I read that you can run it like this (`sh -x /etc/init.d/celeryd start`) to find the error; most of the time it's a file permissions problem, but I don't see anything wrong:

+ DEFAULT_PID_FILE=/logs/celery/run/celeryd@%n.pid
+ DEFAULT_LOG_FILE=/logs/celery/log/celeryd@%n.log
+ DEFAULT_LOG_LEVEL=INFO
+ DEFAULT_NODES=celery
+ DEFAULT_CELERYD=-m celery.bin.celeryd_detach
+ CELERY_DEFAULTS=/etc/default/celeryd
+ test -f /etc/default/celeryd
+ . /etc/default/celeryd
+ CELERY_BIN=/home/ubuntu/.virtualenvs/wlenv/bin/celery
+ CELERYD_CHDIR=/var/www/DIR_TO_MANAGE.PY_FOLDER
+ ENV_PYTHON=/home/ubuntu/.virtualenvs/wlenv/bin/python
+ export DJANGO_SETTINGS_MODULE=sec.settings
+ CELERYD_MULTI=/var/www/DIR_TO_MANAGE.PY_FOLDER/manage.py celeryd_multi
+ CELERYD_OPTS=--time-limit 300 --concurrency=8
+ CELERY_CONFIG_MODULE=celeryconfig
+ CELERYD_LOG_FILE=/logs/celery/log/%n.log
+ CELERYD_PID_FILE=/logs/celery/run/%n.pid
+ CELERYD_USER=ubuntu
+ CELERYD_GROUP=ubuntu
+ CELERY_CREATE_DIRS=1
+ [ -f /etc/default/celeryd ]
+ . /etc/default/celeryd
+ CELERY_BIN=/home/ubuntu/.virtualenvs/wlenv/bin/celery
+ CELERYD_CHDIR=/var/www/DIR_TO_MANAGE.PY_FOLDER
+ ENV_PYTHON=/home/ubuntu/.virtualenvs/wlenv/bin/python
+ export DJANGO_SETTINGS_MODULE=sec.settings
+ CELERYD_MULTI=/var/www/DIR_TO_MANAGE.PY_FOLDER/manage.py celeryd_multi
+ CELERYD_OPTS=--time-limit 300 --concurrency=8
+ CELERY_CONFIG_MODULE=celeryconfig
+ CELERYD_LOG_FILE=/logs/celery/log/%n.log
+ CELERYD_PID_FILE=/logs/celery/run/%n.pid
+ CELERYD_USER=ubuntu
+ CELERYD_GROUP=ubuntu
+ CELERY_CREATE_DIRS=1
+ CELERYD_PID_FILE=/logs/celery/run/%n.pid
+ CELERYD_LOG_FILE=/logs/celery/log/%n.log
+ CELERYD_LOG_LEVEL=INFO
+ CELERYD_MULTI=/var/www/DIR_TO_MANAGE.PY_FOLDER/manage.py celeryd_multi
+ CELERYD=-m celery.bin.celeryd_detach
+ CELERYCTL=celeryctl
+ CELERYD_NODES=celery
+ export CELERY_LOADER
+ [ -n  ]
+ dirname /logs/celery/log/%n.log
+ CELERYD_LOG_DIR=/logs/celery/log
+ dirname /logs/celery/run/%n.pid
+ CELERYD_PID_DIR=/logs/celery/run
+ [ ! -d /logs/celery/log ]
+ [ ! -d /logs/celery/run ]
+ [ -n ubuntu ]
+ DAEMON_OPTS= --uid=ubuntu
+ chown ubuntu /logs/celery/log /logs/celery/run
+ [ -n ubuntu ]
+ DAEMON_OPTS= --uid=ubuntu --gid=ubuntu
+ chgrp ubuntu /logs/celery/log /logs/celery/run
+ [ -n /var/www/DIR_TO_MANAGE.PY_FOLDER/contracts ]
+ DAEMON_OPTS= --uid=ubuntu --gid=ubuntu --workdir="/var/www/DIR_TO_MANAGE.PY_FOLDER/contracts"
+ export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/usr/sbin:/sbin
+ check_dev_null
+ [ ! -c /dev/null ]
+ check_paths
+ dirname /logs/celery/run/%n.pid
+ ensure_dir /logs/celery/run
+ [ -d /logs/celery/run ]
+ mkdir -p /logs/celery/run
+ chown ubuntu:ubuntu /logs/celery/run
+ chmod 02755 /logs/celery/run
+ dirname /logs/celery/log/%n.log
+ ensure_dir /logs/celery/log
+ [ -d /logs/celery/log ]
+ mkdir -p /logs/celery/log
+ chown ubuntu:ubuntu /logs/celery/log
+ chmod 02755 /logs/celery/log
+ start_workers
+ /var/www/DIR_TO_MANAGE.PY_FOLDER/manage.py celeryd_multi start celery --uid=ubuntu --gid=ubuntu --workdir="/var/www/DIR_TO_MANAGE.PY_FOLDER" --pidfile=/logs/celery/run/%n.pid --logfile=/logs/celery/log/%n.log --loglevel=INFO --cmd=-m celery.bin.celeryd_detach --time-limit 300 --concurrency=8
celeryd-multi v3.0.9 (Chiastic Slide)
> Starting nodes...
    > celery.ip-10-51-179-42: OK
    > 300.ip-10-51-179-42: OK
+ exit 0

Any ideas?

Chriss Mejía
  • Playing with the config I noticed I had this commented out: `#CELERYD_NODES="w1 w2 w3"`. I changed it to `CELERYD_NODES="w1 w2 w3"`, and now I get: `> Starting nodes... > w1.ip-10-51-179-42: OK > w2.ip-10-51-179-42: OK > w3.ip-10-51-179-42: OK > 300.ip-10-51-179-42: OK + exit 0`. But it is still failing: `/etc/init.d/celeryd status` gives `Error: No nodes replied within time constraint.` – Chriss Mejía May 15 '15 at 06:34

1 Answer


Which version of Celery are you using? When you debugged, did you run `C_FAKEFORK=1 sh -x /etc/init.d/celeryd start` (with `C_FAKEFORK=1`)?
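Without `C_FAKEFORK` the worker detaches immediately, so `sh -x` only traces the init script itself and the real startup error never reaches your terminal. A minimal debugging invocation:

```sh
# C_FAKEFORK=1 tells celery not to detach, so import errors, permission
# problems, and broker connection failures are printed to the terminal
# instead of being lost when the daemon forks.
C_FAKEFORK=1 sh -x /etc/init.d/celeryd start
```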

If you are using version 3.x+, you don't need to use `manage.py celeryd_multi` (django-celery); instead, use the `celery` command that ships with Celery itself. Take a look at this part of the documentation. Thanks!
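As a rough sketch, the relevant lines of `/etc/default/celeryd` for Celery 3.x would point at the `celery` binary from the virtualenv rather than `manage.py` (the app name `sec` here is an assumption, inferred from `DJANGO_SETTINGS_MODULE="sec.settings"` in your config):

```sh
# Use the celery binary bundled with Celery, not manage.py celeryd_multi:
CELERY_BIN="/home/ubuntu/.virtualenvs/wlenv/bin/celery"

# Django project/app for the worker to load (assumed name):
CELERY_APP="sec"

# The generic init script will then invoke "$CELERY_BIN multi ..."
CELERYD_MULTI="$CELERY_BIN multi"
```

Also note the `300.ip-10-51-179-42: OK` node in your output: `celeryd_multi` is picking up the bare `300` from `--time-limit 300` as a node name, so writing `--time-limit=300` in `CELERYD_OPTS` may help as well.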

Martin Alderete