I'm using kube-fluentd-operator to aggregate logs with fluentd into Elasticsearch and query them in Kibana.
I can see my application (pod) logs inside the cluster, but I cannot see the journal logs (systemd units, kubelet, etc.) from the hosts. There are no noticeable error messages in the fluentd pods' logs, and the stack works fine for logs coming from applications.
Inside the fluentd container I have access to the /var/log/journal directory (drwxr-sr-x 3 root 101 4096 May 21 12:37 journal).
Where should I look next to get the journald logs in my EFK stack?
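One generic thing worth ruling out first (not specific to kube-fluentd-operator): systemd only writes a persistent journal under /var/log/journal when that directory exists and journald's Storage= setting is persistent or auto; otherwise the journal is volatile and lives under /run/log/journal, so a container mounting /var/log/journal sees an existing but empty directory. A minimal sketch of the check, using the default systemd paths:

```shell
# Distinguish a persistent journal (/var/log/journal) from a volatile one
# (/run/log/journal). Run this on the node, or inside the fluentd
# container where /var/log/journal is mounted.
if [ -d /var/log/journal ] && [ -n "$(ls -A /var/log/journal 2>/dev/null)" ]; then
  echo "persistent journal present"
else
  echo "no persistent journal under /var/log/journal"
fi
```

If the directory turns out to be empty, setting `Storage=persistent` in /etc/systemd/journald.conf (or just creating /var/log/journal with `Storage=auto`) and restarting systemd-journald makes journald write there.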
Here's the kube-system.conf file attached to the kube-system namespace:
<match systemd.** kube.kube-system.** k8s.** docker>
  # all k8s-internal and OS-level logs
  @type elasticsearch
  host "logs-es-http.logs"
  port "9200"
  scheme "https"
  ssl_verify false
  user "u1"
  password "password"
  logstash_format true
  #with_transporter_log true
  #@log_level debug
  validate_client_version true
  ssl_version TLSv1_2
</match>
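For comparison, when journald ingestion works, the journal is typically read through a fluent-plugin-systemd source like the sketch below. This is illustrative only, not the config kube-fluentd-operator actually generates; the tag and cursor file path are assumptions:

```
<source>
  @type systemd
  # illustrative tag; the operator may tag journal entries differently
  tag systemd
  path /var/log/journal
  read_from_head true
  <storage>
    @type local
    persistent true
    # assumed cursor file location
    path /var/log/fluentd-journald-cursor.json
  </storage>
</source>
```

If no source like this ever emits events tagged systemd.*, the match block above has nothing to route, which would be consistent with seeing application logs but no journal logs.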
Minimal, simple, according to the docs.
Is it possible that my search terms are wrong? What should I search for in order to get the journal logs?