
I have set up the Elastic Stack on a private Kubernetes cloud and I am running Filebeat on the Kubernetes nodes. Filebeat sends the logs of some of the containers to Logstash, and those eventually show up in Kibana, but the logs of some other containers are not shown, probably because they are never harvested in the first place. What mistake am I making?

Filebeat is able to read from paths such as /var/lib/docker/containers/7a36cc887cc4ba1cea8ebedcf5ed8c74fee9e6cd307bac5e1ba795d07369ca2d/7a36cc887cc4ba1cea8ebedcf5ed8c74fee9e6cd307bac5e1ba795d07369ca2d-json.log. I have JupyterHub, Cassandra and SFTP services running on my Kubernetes cluster, and the logs I see for them with kubectl logs -f are fetched by Filebeat. However, there are also some user applications running on the cluster, and the logs I see for those with kubectl logs -f are not being fetched by Filebeat.

apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-config
  namespace: kube-logging
  labels:
    k8s-app: filebeat
    kubernetes.io/cluster-service: "true"
data:
  filebeat.yml: |-
    filebeat.config:
      prospectors:
        # Mounted `filebeat-prospectors` configmap:
        path: ${path.config}/prospectors.d/*.yml
        # Reload prospectors configs as they change:
        reload.enabled: false
      modules:
        path: ${path.config}/modules.d/*.yml
        # Reload module configs as they change:
        reload.enabled: false
    processors:
      - add_cloud_metadata:

    cloud.id: ${ELASTIC_CLOUD_ID}
    cloud.auth: ${ELASTIC_CLOUD_AUTH}

    output.logstash:
      hosts: ['logstash-service:5044']
---
apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-prospectors
  namespace: kube-logging
  labels:
    k8s-app: filebeat
    kubernetes.io/cluster-service: "true"
data:
  kubernetes.yml: |-
    - type: docker
      containers.ids:
      - "*"
      processors:
        - add_kubernetes_metadata:
            in_cluster: true
---
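
For context, my Filebeat DaemonSet is not shown above; it mounts the two ConfigMaps and the Docker log directory roughly like the sketch below. The container, volume and image names here are illustrative, not copied from my actual manifest:

apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: filebeat
  namespace: kube-logging
  labels:
    k8s-app: filebeat
spec:
  selector:
    matchLabels:
      k8s-app: filebeat
  template:
    metadata:
      labels:
        k8s-app: filebeat
    spec:
      containers:
      - name: filebeat
        image: docker.elastic.co/beats/filebeat:6.7.0   # assumed 6.x image, matching the "prospectors" config style
        args: ["-c", "/etc/filebeat.yml", "-e"]
        volumeMounts:
        - name: config                      # filebeat-config ConfigMap -> /etc/filebeat.yml
          mountPath: /etc/filebeat.yml
          readOnly: true
          subPath: filebeat.yml
        - name: prospectors                 # filebeat-prospectors ConfigMap -> ${path.config}/prospectors.d/*.yml
          mountPath: /usr/share/filebeat/prospectors.d
          readOnly: true
        - name: varlibdockercontainers      # host directory containing the *-json.log files
          mountPath: /var/lib/docker/containers
          readOnly: true
      volumes:
      - name: config
        configMap:
          name: filebeat-config
          defaultMode: 0600
      - name: prospectors
        configMap:
          name: filebeat-prospectors
          defaultMode: 0600
      - name: varlibdockercontainers
        hostPath:
          path: /var/lib/docker/containers

As far as I understand, if /var/lib/docker/containers is not mounted into the Filebeat pod, or the runtime writes container logs somewhere else, Filebeat cannot harvest those files at all.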

I want the logs from all of my containers to be fetched by Filebeat and shown in Kibana. How can I achieve this? What is the missing link?

  • The mistake you are making is not including any details in your question about what "some container logs are not shown" means; so you are getting _some_ container logs but not others? If so, is there something special about the missing ones -- short-lived? Not visible to Kubernetes? Other? – mdaniel May 22 '19 at 04:17
  • @MatthewLDaniel I have edited the question; hope this helps you to help me! – Nitesh Ratnaparkhe May 22 '19 at 08:20
  • I am having the same problem. In my case, logs are not shown in Kibana. Did you find a solution? – Sachith Muhandiram Jul 31 '20 at 12:50
  • @SachithMuhandiram the issue you are describing is very broad. Are the container logs getting created under /var/lib/docker/containers/<container-id>, is Filebeat given the correct path to read the container logs, and are the logs getting ingested into the Elasticsearch server? – Nitesh Ratnaparkhe Aug 03 '20 at 15:04

0 Answers