I run three services in three separate containers. The services send their logs to the system journal, so when I run them directly on a Linux server I can read the logs with journalctl.
If I run the services as Docker containers, I can also gather the logs with docker logs <container_name> or from the /var/lib/docker/containers directory. But when I move to Kubernetes (MicroK8s), I cannot retrieve them with the kubectl logs command, and there are no logs in /var/log/containers or /var/log/pods either.
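For reference, these are roughly the commands I use to read the logs outside Kubernetes (amf is just an example service/container name):

# journalctl -u amf.service -f
# docker logs -f amf
# ls /var/lib/docker/containers/<container_id>/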
If I log in to the pods, I can see that the processes are running, but without logs I cannot tell whether they are running correctly. I also tried to change the MicroK8s kubelet runtime from containerd to Docker (roughly as sketched below), but I still cannot get any logs.
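The runtime change was done approximately like this; the kubelet args file is the standard MicroK8s location, but the exact flags may differ between MicroK8s versions:

# microk8s stop
# vi /var/snap/microk8s/current/args/kubelet
#   (replace --container-runtime=remote and --container-runtime-endpoint=... with --container-runtime=docker)
# microk8s start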
# kubectl get po -o wide
NAME                              READY   STATUS    RESTARTS   AGE   IP             NODE     NOMINATED NODE   READINESS GATES
amf-deployment-7785db9758-h24kz   1/1     Running   0          72s   10.1.243.237   ubuntu   <none>
# kubectl describe po amf-deployment-7785db9758-h24kz
Events:
Type    Reason          Age   From                Message
----    ------          ----  ----                -------
Normal  Scheduled       87s   default-scheduler   Successfully assigned default/amf-deployment-7785db9758-h24kz to ubuntu
Normal  AddedInterface  86s   multus              Add eth0 [10.1.243.237/32]
Normal  Pulled          86s   kubelet             Container image "amf:latest" already present on machine
Normal  Created         86s   kubelet             Created container amf
Normal  Started         86s   kubelet             Started container amf
# kubectl logs amf-deployment-7785db9758-h24kz
# kubectl logs -f amf-deployment-7785db9758-h24kz
^C
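For completeness, checks along these lines should show where the container's stdout actually points and whether the runtime keeps any log files on the node (they assume the main process is PID 1 inside the container and a default MicroK8s install; <pod_uid> is a placeholder, and the kubelet unit is daemon-kubelet on older MicroK8s releases or daemon-kubelite on newer ones):

# kubectl exec amf-deployment-7785db9758-h24kz -- ls -l /proc/1/fd/1 /proc/1/fd/2
# ls /var/log/pods/default_amf-deployment-7785db9758-h24kz_<pod_uid>/
# microk8s ctr --namespace k8s.io containers list
# journalctl -u snap.microk8s.daemon-kubelet -f

If /proc/1/fd/1 points to a regular file or /dev/null instead of a pipe, the process is not writing to stdout inside the pod, which is what kubectl logs relies on.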
The following screenshot shows the difference between running the same container with Docker and running it with Kubernetes. The behaviour seems very strange, since the logs can be gathered when the application runs as a standalone Docker container, but not when it runs under Kubernetes.

[screenshot: the same container showing log output under Docker but nothing under Kubernetes]