
I am deploying the Spring Cloud Data Flow server using Docker. I have created a data processing pipeline inside the Data Flow server by deploying a couple of Spring Boot applications as source, processor and sink. To access the log of each service, I currently have to either tail it from inside the Docker container (bash) or copy it from the container to the local disk.

I want to push these logs to Kafka using the log4j Kafka appender for later analysis. I am already doing this for other services running outside Spring Cloud Data Flow. Is there a way to manage the logs of services running inside Spring Cloud Data Flow using log4j?

krajwade

2 Answers


Spring Cloud Stream and Spring Cloud Task apps are standalone Spring Boot applications. This SO thread has some insights into adding the relevant libraries so that Spring Boot applications consistently publish their logs to Kafka.
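
For context, here is a minimal sketch of the build changes that typically enable this, assuming Log4j 2 and Maven (none of these coordinates come from the linked thread): swap out Spring Boot's default Logback starter, pull in the Log4j 2 starter, and add the Kafka client that Log4j 2's KafkaAppender needs on the classpath.

    <!-- pom.xml excerpt: a sketch, versions assumed to be managed by the Spring Boot parent -->
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
            <exclusions>
                <!-- remove the default Logback-based logging starter -->
                <exclusion>
                    <groupId>org.springframework.boot</groupId>
                    <artifactId>spring-boot-starter-logging</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <!-- Log4j 2 auto-configuration for Spring Boot -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-log4j2</artifactId>
        </dependency>
        <!-- required on the classpath by Log4j 2's KafkaAppender -->
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
        </dependency>
    </dependencies>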

If you were to make this addition to the OOTB apps as well, please check out the patching procedure described in the reference guide.
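
A log4j2.xml along these lines would then ship every application log event to Kafka while keeping console output. This is only a sketch: the topic name app-logs and the broker address localhost:9092 are placeholders, not values taken from the question.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- log4j2.xml sketch: topic and bootstrap.servers are assumptions -->
    <Configuration status="WARN">
      <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
          <PatternLayout pattern="%d{ISO8601} %-5level [%t] %logger{36} - %msg%n"/>
        </Console>
        <!-- KafkaAppender publishes each log event to the given topic -->
        <Kafka name="KafkaAppender" topic="app-logs">
          <PatternLayout pattern="%d{ISO8601} %-5level [%t] %logger{36} - %msg%n"/>
          <Property name="bootstrap.servers">localhost:9092</Property>
        </Kafka>
      </Appenders>
      <Loggers>
        <!-- per the Log4j 2 docs, keep org.apache.kafka at INFO to avoid recursive logging -->
        <Logger name="org.apache.kafka" level="INFO"/>
        <Root level="INFO">
          <AppenderRef ref="Console"/>
          <AppenderRef ref="KafkaAppender"/>
        </Root>
      </Loggers>
    </Configuration>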

Sabby Anandan

If you are running your Spring Cloud Data Flow server and Kafka in containers, you can use Docker links so that the containers can talk to each other, for instance: $ docker run -d -P --name web --link db training/webapp python app.py, which links a web app container to a DB container. See https://docs.docker.com/engine/userguide/networking/default_network/dockerlinks/#communication-across-links for further information.

  • The containers are already linked. I guess the problem is that the Data Flow server is using the default Spring logging mechanism instead of the log4j file that is in the classpath of each service. – krajwade May 18 '17 at 11:47