
======================== LAST UPDATE =========================
I found the Celery signal worker_process_init.connect(some_function),
which lets me run any function I want before a worker child process starts running; in my case I use it to change the log handlers as I want:

handler = logging.FileHandler('my%s.log' % current_process().index, 'a')
logger.addHandler(handler)

(Another signal I found, setup_logging, was not effective in my case.)
++++++++++++++++++++++++++ UPDATE ++++++++++++++++++++++++++
I moved the handler out of my settings.py once I understood that it is loaded only once, so now it is defined in task.py:

def custom_logger(name):
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    handler = logging.FileHandler('my%s.log' % current_process().index, 'a')
    logger.addHandler(handler)
    return logger

and a task, for example:

@tenant_celery.task(base=HandlersImplTenantTask)
@concurrent_task(task_timeout=1)
def add():
    task_id = add.request.id
    l = custom_logger(task_id)
    l.info("task name is -   ADD")
    l.info("my worker id is: %s" % current_process().index)
    return 5+7

Not the most aesthetic solution, but it does work: the log files change at run time.

==============================================================

I'm running Celery tasks under Django.

I want every Celery worker to write to a different log file, but always to the same one each time, for example: worker 1 --> myLog1.log, worker 2 --> myLog2.log.

This is how my settings.py looks:

logfile = '%s/%s' % (logdir, APP_NAME + "-" + process_name + '.log')
CELERY_WORKER_LOGFILE = '%s/%s' % (logdir, 'celery_worker.log')

and task.py:

@tenant_celery.task(base=HandlersImplTenantTask)
def get_worker_id():
    logdir = '%s/%s' % (os.curdir, 'log')
    settings.CELERY_WORKER_LOGFILE = '%s/%s-%s.log' % (logdir, 'celery_worker', current_process().index)
    print settings.CELERY_WORKER_LOGFILE
    # prints the new log file path
    logger.info("HELLO FROM TASK %s", current_process().index)
    # ...but still writes to the old log file

The same goes for the second task I have.

But although each task prints a different LOGFILE path, it keeps writing to the same log (!!) as it appears in settings.py.

My run command is:

celery worker -c 2 --broker=amqp://1000:1000@localhost:5672//1000

Is there any way to (really) change the log file at run time?

(I could not work out whether Django's logging signals can help me.)

I also found THIS answer, but it does not work for me.

Thanks.


1 Answer

You know that when running the worker you can specify the logfile via an argument, like this:

celery -A proj worker -l info -f path/to/yourfile.log

More in the official docs.

Mauro Rocco
  • Thanks for your answer, but because I am working with multi-processing, it would be problematic to change the running command. Or am I missing something? Thanks – amichib May 15 '16 at 06:59
  • You mean that you are using celery multi? Or the init.d script? In both cases you can specify options for every worker. – Mauro Rocco May 15 '16 at 10:42
  • I mean that I am using this command: celery worker -c 5 --broker=amqp://1000:1000@localhost:5672//1000, so I can't configure the log file each time. Am I wrong? – amichib May 15 '16 at 11:29
  • Hi, you are using 1 worker with concurrency 5. Just use celery multi to launch 5 different workers with concurrency 1 and different log files. – Mauro Rocco May 16 '16 at 23:18
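The commenter's `celery multi` suggestion could look roughly like this (a sketch, not tested against this setup; the log directory is illustrative, and `%n` is celery's node-name format specifier for `--logfile`):

```shell
# Five workers with concurrency 1 each, sharing the broker from the question.
# %n expands to each node's name, so every worker writes its own log file.
celery multi start 5 -c 1 \
    --broker=amqp://1000:1000@localhost:5672//1000 \
    --logfile=/var/log/celery/%n.log
```

This trades one 5-process worker for five 1-process workers, which sidesteps the per-child handler juggling entirely.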