Questions tagged [logstash-file]

The Logstash file input streams events from files.

221 questions
0
votes
0 answers

I need to transfer a dataset from the Power BI service to Elasticsearch through Logstash. It does not send any data, but it does create the index in Elasticsearch

I created a REST API for the Power BI dataset. This is my config file to transfer data from the Power BI service to Elasticsearch: input { http_poller { urls => { powerbi_data => { method => get url =>…
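A frequent cause of "index created but no documents" with http_poller is a missing `schedule` (without one the poller never fires) or a codec that does not match the response. A minimal sketch, assuming a JSON endpoint; the URL, index name, and poll interval below are placeholders, not taken from the question:

```
input {
  http_poller {
    urls => {
      powerbi_data => {
        method => get
        url => "https://example.com/powerbi/export"   # placeholder endpoint
        headers => { Accept => "application/json" }
      }
    }
    schedule => { every => "5m" }   # required: without a schedule the input never polls
    codec => "json"                 # parse the response body into event fields
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "powerbi-data"         # hypothetical index name
  }
}
```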
0
votes
1 answer

How to make logstash file input work on windows machine?

I have a running version of Logstash on my Windows machine. The file input filter does not work: the Logstash script starts and nothing happens -> no success or error. The generator input and stdin both work properly. The file input filter stopped…
MMA
  • 41
  • 5
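The usual Windows pitfalls with the file input are backslashes in the `path` glob and stale sincedb state. A minimal sketch, using a hypothetical C:/logs directory; note the forward slashes and the Windows `NUL` device in place of /dev/null:

```
input {
  file {
    # On Windows, use forward slashes in the glob, not backslashes
    path => "C:/logs/*.log"
    start_position => "beginning"
    # "NUL" is the Windows equivalent of /dev/null for discarding sincedb state,
    # so the file is re-read from the start on every run
    sincedb_path => "NUL"
  }
}
output {
  stdout { codec => rubydebug }
}
```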
0
votes
0 answers

Logstash shut down with an exception

I tried to install Logstash and ran the logstash -f ./conf/logstash-sample.conf command, but it gives me the error shown in the attached screenshot. I am new to the ELK stack
0
votes
0 answers

How to see the DLQ messages in logstash?

I've configured the dead letter queue in Logstash. logstash.yml: dead_letter_queue.enable: true I didn't configure the path, so it uses the default path path.data/dead_letter_queue. Test case: input{ // kafka } output{ // elastic } I'm…
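DLQ entries can be inspected by reading them back with the dead_letter_queue input plugin in a separate pipeline. A sketch, assuming a default `path.data` location on a package install; adjust the path to your installation:

```
input {
  dead_letter_queue {
    # Default DLQ location when path.data is not customized (assumption)
    path => "/usr/share/logstash/data/dead_letter_queue"
    pipeline_id => "main"     # id of the pipeline whose DLQ should be read
    commit_offsets => true    # remember position so entries are not re-read
  }
}
output {
  # metadata => true also prints the [@metadata][dead_letter_queue] fields
  # (failure reason, plugin type) attached to each dead-lettered event
  stdout { codec => rubydebug { metadata => true } }
}
```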
0
votes
0 answers

How to configure logstash kafka with json deserializer? (Closed)

I have a config in my Logstash where I need to consume the data from a Kafka topic and write it into an Elastic index. This is my conf: input { kafka{ codec => json {} bootstrap_servers => "my_brokers" security_protocol => "SASL_PLAINTEXT" …
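For reference, the usual shape of a Kafka-to-Elasticsearch pipeline with JSON deserialization is `codec => json` on the input, as the question already has. A sketch with hypothetical topic and index names; the SASL/JAAS authentication settings from the question are omitted:

```
input {
  kafka {
    bootstrap_servers => "my_brokers"
    topics => ["my_topic"]              # hypothetical topic name
    security_protocol => "SASL_PLAINTEXT"
    codec => json {}                    # deserialize each record value as JSON
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "my-index"                 # hypothetical index name
  }
}
```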
0
votes
1 answer

logstash mix json and plain content

I use Logstash as a syslog relay; it forwards the data to Graylog and writes data to a file. I use the dns filter module to replace the IP with the FQDN, and after this I can't write raw content to the file; the IP is "json-ed". What I get…
lucas24007
  • 93
  • 1
  • 6
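The file output defaults to the json_lines codec, which is why the event lands in the file "json-ed". One way to keep a raw line in the file copy while still forwarding the enriched event elsewhere is the line codec. A sketch; `source_ip` and the output path are hypothetical names:

```
filter {
  dns {
    reverse => ["source_ip"]   # hypothetical field holding the IP to resolve
    action => "replace"        # overwrite the field with the FQDN
  }
}
output {
  file {
    path => "/var/log/relay/raw.log"
    # The file output defaults to json_lines; the line codec writes
    # only the formatted string instead of the whole JSON document.
    codec => line { format => "%{message}" }
  }
}
```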
0
votes
1 answer

logstash file output not working with metadata fields

I have the following pipeline. The requirement is: I need to write "metrics" data to ONE file and EVENT data to another file. I am having two issues with this pipeline. The file output is not creating a "timestamped file" every 30 seconds; instead it is…
kosa
  • 65,990
  • 13
  • 130
  • 167
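For the metadata half of the problem: [@metadata] fields are never serialized into the output, but they can still be referenced with sprintf syntax in the file output's `path`. A sketch; `[type]` and `[@metadata][outfile]` are hypothetical names used for routing:

```
filter {
  # Hypothetical routing: metrics and events go to different files
  if [type] == "metrics" {
    mutate { add_field => { "[@metadata][outfile]" => "metrics" } }
  } else {
    mutate { add_field => { "[@metadata][outfile]" => "events" } }
  }
}
output {
  file {
    # [@metadata] fields work in sprintf references like any other field
    path => "/tmp/%{[@metadata][outfile]}.log"
  }
}
```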
0
votes
1 answer

Extracting file path from filebeat into logstash and then into elastic index

I'm trying to take a part of my path and give it as the index for my Elasticsearch index. My logstash config file looks like this (note: the config file might be wrong because I tried 100 different things): # The # character at the beginning of a line…
Shino Lex
  • 487
  • 6
  • 22
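One common approach: extract the interesting path segment from the field Filebeat ships ([log][file][path] in recent versions) into a metadata field, then reference it in the elasticsearch output. A sketch; the grok pattern and field names are assumptions about the directory layout:

```
filter {
  # Assumption: the parent directory name of the log file is the index name,
  # e.g. /data/myapp/app.log -> "myapp"
  grok {
    match => { "[log][file][path]" => ".*/%{DATA:[@metadata][index_name]}/[^/]+$" }
  }
  # Elasticsearch index names must be lowercase
  mutate { lowercase => ["[@metadata][index_name]"] }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][index_name]}"
  }
}
```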
0
votes
3 answers

Unable to start Logstash server and throwing error

I want to pass a log file as input to Logstash. I have added /bin to the environment variable PATH so that I can access Logstash from anywhere. Below is my conf file: logstash.conf input{ path => "D:\nest\es-logging-example\log\info\info.log" …
Digvijay
  • 2,887
  • 3
  • 36
  • 86
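The error here is typically that `path` is an option of the file input plugin, not of the input section itself, so it must sit inside a `file { }` block. A sketch of the corrected input, keeping the question's path (rewritten with forward slashes, which Logstash prefers on Windows):

```
input {
  # "path" must live inside a plugin block such as file {},
  # not directly under input {} — otherwise Logstash fails to start.
  file {
    path => "D:/nest/es-logging-example/log/info/info.log"
    start_position => "beginning"
    sincedb_path => "NUL"   # Windows equivalent of /dev/null
  }
}
```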
0
votes
1 answer

logstash create new file every 30 seconds

I have the following filter configuration in my Logstash pipeline. What it does is: at the start of the event, the first filter creates a CSV file with a header and sets the file name in metadata. The second filter writes the output to the above CSV. The challenge…
kosa
  • 65,990
  • 13
  • 130
  • 167
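The file output opens a new file whenever the interpolated `path` changes, so a time-based sprintf reference in the path rotates files on the clock. A sketch; note that this format rotates once per minute at the finest (exact 30-second buckets would need the timestamp rounded in a filter first), and the time comes from the event's @timestamp:

```
output {
  file {
    # Joda-style time format in the path; a new file is created whenever
    # the interpolated path changes — here, at most once per minute.
    path => "/tmp/out-%{+YYYY-MM-dd-HH-mm}.csv"
  }
}
```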
0
votes
1 answer

filebeat tomcat module and collect webapps logs files

I just installed Filebeat on my remote server to collect the logs of an app. Everything seems OK: the ELK stack retrieves the info and I can view it via Kibana. Today, I want to collect the logs generated by 2 webapps hosted on the same Tomcat server. I…
anthony44
  • 345
  • 1
  • 4
  • 15
0
votes
1 answer

Logstash variable in pipeline config

I am setting up Logstash to ingest Airflow logs. The following config is giving me the output I need: input { file { path => "/my_path/logs/**/*.log" start_position => "beginning" sincedb_path => "/dev/null" } } filter { …
Flo
  • 377
  • 2
  • 15
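Worth noting for variables in pipeline configs: Logstash substitutes `${VAR}` (with an optional `:default` fallback) from the environment anywhere in the config at load time. A sketch using a hypothetical AIRFLOW_LOG_DIR variable around the question's file input:

```
input {
  file {
    # ${VAR:default} is replaced from the environment when the pipeline loads
    path => "${AIRFLOW_LOG_DIR:/my_path/logs}/**/*.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
```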
0
votes
0 answers

how to split a document into separate fields using logstash

I am new to Logstash, where I am using a file as input and Elasticsearch as output. I am injecting a Mongo document through the file input, and when it is inserted into Elasticsearch, it is all added under a single field "message". But I want this document to…
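If each line of the file is a JSON document (as a Mongo export usually is), the json filter expands the `message` field into top-level fields; for non-JSON lines, the dissect or grok filters do the splitting instead. A minimal sketch of the JSON case:

```
filter {
  # Parse the raw line in "message" into top-level event fields
  json {
    source => "message"
    # remove_field => ["message"]   # optional: drop the raw line once parsed
  }
}
```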