I'm working with EFK and I would like to know whether I can add a field that the user can modify in a table lens (or something else, as long as it is an array).
It would be like a note or a comment about the line.
Is it possible to put an editable field…
I have deployed an EFK stack in a Kubernetes cluster.
I have configured it so that Fluentd fetches Nginx logs as well as PHP logs (both are in JSON format, with one JSON log per line).
This is my config:
fluent.conf: |-
…
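Since the config excerpt is cut off, here is a minimal sketch of the kind of source block this setup usually involves, assuming a tail input with one JSON object per line; the path, pos_file and tag are placeholders, not the actual values from the config above:

<source>
  @type tail
  path /var/log/nginx/access.log            # placeholder path to the Nginx JSON log
  pos_file /var/log/fluentd/nginx.log.pos   # remembers the read position across restarts
  tag nginx.access
  <parse>
    @type json                              # each line is parsed as a standalone JSON object
  </parse>
</source>

A second, analogous source block would point at the PHP log file with its own tag.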
I followed this DigitalOcean guide successfully. I can access Kibana and Elasticsearch, and I can see the logs of the counter example.
In my Kubernetes cluster I also have a LoadBalancer service in the default namespace, where I am hosting my website. How can I access…
In an EFK setup, Fluentd suddenly stopped sending logs to Elasticsearch, with the following errors in the logs:
2020-09-28 18:48:55 +0000 [warn]: #0 Could not communicate to Elasticsearch, resetting connection and trying again. getaddrinfo: Name or…
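That warning means the Elasticsearch host name could not be resolved (a DNS failure), so the output plugin keeps resetting its connection. As a hedged sketch (not the actual config from this setup), these fluent-plugin-elasticsearch options are commonly involved, assuming the service is reachable as elasticsearch.logging.svc.cluster.local:

<match **>
  @type elasticsearch
  host elasticsearch.logging.svc.cluster.local  # assumed service name; must resolve from the Fluentd pod
  port 9200
  reconnect_on_error true    # rebuild the connection after a failed request
  reload_on_failure true     # re-discover nodes when a request fails
  reload_connections false   # do not periodically reload the node list behind the Service
</match>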
I'm using kube-fluentd-operator to aggregate logs using fluentd into Elasticsearch and query them in Kibana.
I can see my application (pods) logs inside the cluster.
However, I cannot see the journal logs (systemd units, kubelet, etc.) from the hosts…
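For reference, a minimal sketch of a source that reads the host journal, assuming the fluent-plugin-systemd input plugin is installed and /var/log/journal from the host is mounted into the Fluentd pod; the unit name and cursor path are examples only:

<source>
  @type systemd
  tag systemd.kubelet
  path /var/log/journal                                  # host journal mounted into the pod
  matches [{ "_SYSTEMD_UNIT": "kubelet.service" }]       # example: only kubelet entries
  read_from_head true
  <storage>
    @type local
    persistent true
    path /var/log/fluentd-journald-kubelet-cursor.json   # remembers the journal cursor
  </storage>
</source>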
I am trying to get the logs from my application container and attach a Fluentd log agent as a sidecar container in my project. I want to see which log is coming from which application in my Kibana dashboard. That's why I configured it like that in…
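One common way to mark which application a record came from is a record_transformer filter in the sidecar's config; a sketch, assuming records are tagged app.** and using a hypothetical app_name field:

<filter app.**>
  @type record_transformer
  <record>
    app_name my-service              # hypothetical label for this sidecar's application
    hostname "#{Socket.gethostname}" # evaluated once when the config is loaded
  </record>
</filter>

In Kibana the records can then be filtered or split by app_name.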
Example:
My documents:
{"_id":"1", "data_sent":"100"}
{"_id":"2", "data_sent":"110"}
{"_id":"3", "data_sent":"120"}
I would like to get the value of 'data_sent' for every new document and sum it up into another index, let's say
index_name:…
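For reference, summing the field at query time can be done with a sum aggregation; this is a sketch, assuming the documents live in an index called my_index and that data_sent is mapped as a numeric type rather than a string. Note that it only computes the total on read, rather than writing it into a second index:

GET my_index/_search
{
  "size": 0,
  "aggs": {
    "total_data_sent": { "sum": { "field": "data_sent" } }
  }
}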
We have EFK implemented on OpenShift Container Platform version 4.3.
Issue:
Multiline logs such as Java stack traces and SQL queries are not being parsed as a single event in Fluentd, and because of this we are getting multiple entries in Kibana.
We…
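A commonly used approach for this is the fluent-plugin-concat filter, which joins continuation lines back into a single event before they reach Elasticsearch; a sketch, assuming the log text is in a field called log and that new events start with a timestamp (the tag and regexp are examples):

<filter app.**>
  @type concat
  key log
  multiline_start_regexp /^\d{4}-\d{2}-\d{2}/   # lines not matching this are appended to the previous event
  flush_interval 5                              # emit a partially buffered event after 5 seconds of silence
</filter>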
For testing, I created a file in my home directory:
touch /home/testuser/test.log
I use td-agent to deliver logs to Elasticsearch (EFK).
This is my test configuration in td-agent.conf:
@type tail
path /home/testuser/test.log
…
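Since the excerpt is cut off, here is a hedged sketch of a complete tail source plus an Elasticsearch match for such a test; the tag, pos_file and index name are placeholders, and the td-agent user must be able to read /home/testuser/test.log:

<source>
  @type tail
  path /home/testuser/test.log
  pos_file /var/log/td-agent/test.log.pos   # td-agent must be able to write here
  tag test.log
  read_from_head true
  <parse>
    @type none                              # treat each line as plain text
  </parse>
</source>

<match test.**>
  @type elasticsearch
  host localhost
  port 9200
  index_name test-log                       # placeholder index name
</match>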
I'm trying to forward logs to Elasticsearch and got stuck on setting the index dynamically (based on a field in the input data).
My input data format is JSON and always has the key "es_idx". I wish to forward to Elasticsearch by that key and add it…
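For reference, fluent-plugin-elasticsearch can take the index name from a field of the record via target_index_key; a sketch, assuming records are tagged app.** and a fallback index name is acceptable:

<match app.**>
  @type elasticsearch
  host elasticsearch
  port 9200
  target_index_key es_idx   # use the record's "es_idx" value as the index name
  index_name fallback-logs  # placeholder fallback used when the field is missing
</match>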
I'm running the EFK stack in my Kubernetes cluster; however, each time I start the Kibana dashboard I need to manually import export.ndjson. I've heard that all Kibana objects are stored in Elasticsearch, so I mounted this file to…
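For reference, if the objects do have to be re-imported, Kibana's Saved Objects import API lets that step be scripted instead of done by hand in the UI; a sketch, assuming Kibana is reachable as kibana:5601 and export.ndjson is on disk:

curl -X POST "http://kibana:5601/api/saved_objects/_import?overwrite=true" \
  -H "kbn-xsrf: true" \
  --form file=@export.ndjson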
I am new to Fluentd (and the whole EFK stack). I have Fluentd sending the logs to Elasticsearch (both in Kubernetes), and I am trying to find a way to determine when the last (successful) flush of the buffer happened.
I can think of two ways…
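For reference, Fluentd's monitor_agent input exposes per-plugin buffer and retry state over HTTP; it does not report a literal last-flush timestamp, but retry_count and buffer_queue_length make a stuck buffer visible. A sketch, assuming port 24220 is free:

<source>
  @type monitor_agent
  bind 0.0.0.0
  port 24220
</source>

The metrics can then be read from http://<fluentd-host>:24220/api/plugins.json.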
I have a Kubernetes cluster with 1 master and 5 nodes. I am setting up EFK following this reference: https://www.digitalocean.com/community/tutorials/how-to-set-up-an-elasticsearch-fluentd-and-kibana-efk-logging-stack-on-kubernetes#step-4-%E2%80%94-creating-the-fluentd-daemonset…
I've got the EFK stack installed on Kubernetes following this addon: https://github.com/kubernetes/kubernetes/tree/master/cluster/addons/fluentd-elasticsearch
What I want to achieve is having all the logs of the same pod together, and even maybe…
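For reference, the addon's Fluentd config normally enriches each record with pod metadata through the kubernetes_metadata filter, so records can be grouped in Kibana by kubernetes.pod_name; a sketch, assuming container logs are tagged kubernetes.**:

<filter kubernetes.**>
  @type kubernetes_metadata   # adds kubernetes.pod_name, namespace_name, labels, etc. to each record
</filter>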
I would like to implement an Elasticsearch and Kibana instance shared between multiple users, where multiple users can have their applications (services) and write their logs to the same or different files. But when they log in from Kibana they can…