======================== LAST UPDATE =========================
I found the Celery signal worker_process_init.connect(some_function),
which lets me run any function I want before the worker child process starts running, and in my case I can change the log handlers as I want:
handler = logging.FileHandler('my%s.log' % current_process().index, 'a')
logger.addHandler(handler)
(Another signal that I found, setup_logging, was not effective in my case.)
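A minimal, self-contained sketch of that idea. The helper name setup_worker_logging and the logger name 'tenant' are my own illustration, the index argument stands in for current_process().index, and a temp directory replaces the real log directory; the commented-out signal wiring shows where it would plug into Celery:

```python
import logging
import os
import tempfile

def setup_worker_logging(index, logdir):
    # `index` stands in for billiard's current_process().index
    logger = logging.getLogger('tenant')
    logger.setLevel(logging.DEBUG)
    path = os.path.join(logdir, 'my%s.log' % index)
    handler = logging.FileHandler(path, 'a')
    handler.setFormatter(logging.Formatter('%(levelname)s %(message)s'))
    logger.addHandler(handler)
    return path

# In a real project this would be wired to the signal, e.g.:
# from celery.signals import worker_process_init
# @worker_process_init.connect
# def init_worker(**kwargs):
#     setup_worker_logging(current_process().index, LOGDIR)

logdir = tempfile.mkdtemp()
path = setup_worker_logging(0, logdir)
logging.getLogger('tenant').info('hello from worker 0')
print(open(path).read().strip())  # INFO hello from worker 0
```

Because worker_process_init fires once per child process, each worker ends up with exactly one FileHandler pointing at its own file.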
++++++++++++++++++++++++++ UPDATE ++++++++++++++++++++++++++
I moved the handler out of my settings.py since I understood it is loaded only once, so now it is defined in task.py:
def custom_logger(name):
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    handler = logging.FileHandler('my%s.log' % current_process().index, 'a')
    logger.addHandler(handler)
    return logger
and a task, for example:
@tenant_celery.task(base=HandlersImplTenantTask)
@concurrent_task(task_timeout=1)
def add():
    task_id = add.request.id
    l = custom_logger(task_id)
    l.info("task name is - ADD")
    l.info("my worker id is: %s" % current_process().index)
    return 5 + 7
Not the most aesthetic solution, but it does work: changing log files at run time.
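One caveat worth noting with custom_logger: logging.getLogger caches loggers by name, so if the same name ever reaches it twice, a second FileHandler gets stacked and every record is written twice. A small guard avoids that; this is a sketch where the hypothetical worker_index argument stands in for current_process().index and a temp directory replaces the real log directory:

```python
import logging
import os
import tempfile

LOGDIR = tempfile.mkdtemp()  # stand-in for the real log directory

def custom_logger(name, worker_index):
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    if not logger.handlers:  # getLogger(name) returns a cached logger: don't stack handlers
        path = os.path.join(LOGDIR, 'my%s.log' % worker_index)
        logger.addHandler(logging.FileHandler(path, 'a'))
    return logger

a = custom_logger('task-42', 1)
b = custom_logger('task-42', 1)  # same cached logger; no duplicate handler added
print(a is b, len(a.handlers))  # True 1
```

With per-task-id names this rarely triggers, but the guard makes the helper safe to call more than once.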
==============================================================
I'm running Celery tasks under Django.
I want every Celery worker to write to a different log file,
but always to the same one per worker -->
for example: worker 1 --> myLog1.log, worker 2 --> myLog2.log.
This is how my settings.py looks:
logfile = '%s/%s' % (logdir, APP_NAME + "-" + process_name + '.log')
CELERY_WORKER_LOGFILE = '%s/%s' % (logdir, 'celery_worker.log')
and task.py:
@tenant_celery.task(base=HandlersImplTenantTask)
def get_worker_id():
    logdir = '%s/%s' % (os.curdir, 'log')
    settings.CELERY_WORKER_LOGFILE = '%s/%s-%s.log' % (logdir, 'celery_worker', current_process().index)
    print(settings.CELERY_WORKER_LOGFILE)  # prints the new log file
    logger.info("HELLO FROM TASK %s", current_process().index)  # writes to the wrong logfile
The same goes for the second task I have.
Although it prints a different LOGFILE path on every task,
it keeps writing to the same log (!!) that appears in settings.py.
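This behavior follows from how logging is configured: a FileHandler opens its file stream when the handler is created, so rebinding settings.CELERY_WORKER_LOGFILE afterwards only changes a string, never the already-open stream. A minimal sketch of the effect, with paths under a temp directory and the logger name 'demo' as illustration:

```python
import logging
import os
import tempfile

logdir = tempfile.mkdtemp()
first = os.path.join(logdir, 'celery_worker.log')

logger = logging.getLogger('demo')
logger.addHandler(logging.FileHandler(first, 'a'))  # the stream to `first` opens here

# Rebinding the settings value later does not touch the open handler:
CELERY_WORKER_LOGFILE = os.path.join(logdir, 'celery_worker-1.log')
logger.warning('still goes to the first file')

print(os.path.exists(first), os.path.exists(CELERY_WORKER_LOGFILE))  # True False
```

To actually redirect output at run time you have to swap the handler itself (remove the old one, add a new FileHandler), not the settings value it was built from.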
My run command is:
celery worker -c 2 --broker=amqp://1000:1000@localhost:5672//1000
Is there any way to (really) change the log file at run time?
(I could not work out whether Django logging signals can help me.)
I also found THIS answer, but it does not work for me.
Thanks.
So I can't configure the log file each time. Am I wrong? – amichib May 15 '16 at 11:29