
I'm doing integration testing on my services. It's a very chatty network and I'm having a hard time keeping track of the order in which different requests are made. I would like a way to stream logging messages from different Python processes into the same terminal (each process is its own app/service started in a separate terminal, not multiprocessing within a single app). They all run on the same machine, so I don't think I need third-party services like the ELK stack for this; I just want to redirect my logs on the same machine to one terminal so that I can see all log lines in chronological order.

Environment: Ubuntu 16.04 LTS, Python 3.8, virtualenv

Logging is performed with the Python logging module to stdout, and I am able to customize the handlers.
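
For example, on the application side I could imagine pointing each service at a `SocketHandler` and running a small listener process that prints whatever it receives; this is roughly the socket-based approach from the logging cookbook. Untested sketch below, with the file name `listener.py` just a placeholder and port 9020 being `logging.handlers.DEFAULT_TCP_LOGGING_PORT`:

```python
# listener.py -- untested sketch of a central viewer process, run in the
# terminal where I want to see everything interleaved.
import logging
import logging.handlers
import pickle
import socketserver
import struct

class LogRecordStreamHandler(socketserver.StreamRequestHandler):
    """Unpickle LogRecords sent by SocketHandler clients and re-emit them locally."""

    def handle(self):
        while True:
            # SocketHandler sends each record as a 4-byte big-endian length
            # prefix followed by a pickled dict of the LogRecord's attributes.
            header = self.connection.recv(4)
            if len(header) < 4:
                break
            length = struct.unpack(">L", header)[0]
            payload = self.connection.recv(length)
            while len(payload) < length:
                payload += self.connection.recv(length - len(payload))
            record = logging.makeLogRecord(pickle.loads(payload))
            logging.getLogger(record.name).handle(record)

if __name__ == "__main__":
    # One format for every service, with a timestamp so the ordering is visible.
    logging.basicConfig(
        level=logging.DEBUG,
        format="%(asctime)s %(name)s %(levelname)s %(message)s",
    )
    server = socketserver.ThreadingTCPServer(
        ("localhost", logging.handlers.DEFAULT_TCP_LOGGING_PORT),
        LogRecordStreamHandler,
    )
    server.serve_forever()
```

Each service would then keep its stdout handler and just add a socket handler next to it:

```python
import logging
import logging.handlers

# Added alongside the existing stdout handler in each service.
logging.getLogger().addHandler(
    logging.handlers.SocketHandler("localhost", logging.handlers.DEFAULT_TCP_LOGGING_PORT)
)
```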

I'm not sure what's best here: whether to achieve this at the application level (e.g. something like the sketch above) or at the OS level. Ideas? Thanks

Neil
  • Does this answer your question? [Python multiple logger for multiple modules](https://stackoverflow.com/questions/39718895/python-multiple-logger-for-multiple-modules) – Kraay89 Jan 06 '21 at 14:31
  • It does not answer my question. The question there is about multiple modules in the same process launched in the same terminal. – Neil Jan 06 '21 at 14:39
  • 1
    https://docs.python.org/2/howto/logging-cookbook.html#logging-to-a-single-file-from-multiple-processes – Kraay89 Jan 06 '21 at 14:42

0 Answers