
I generate a JSON file every 5 minutes through a Python script and try to push the data to Elasticsearch, but Logstash prints the following message and doesn't push any data to Kibana.

My pipeline: File --> Logstash --> Elastic --> Kibana

JSON file output:

{
    "platform": "Computer",
    "mode": "Live",
    "users": 899
}

Log Message:

[INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

My logstash.conf file:

input {
        file {
                path => "D:/elk/logs_folder/test.json"
                start_position => "beginning"
                sincedb_path => "NUL"
                codec => "json"
        }
}

filter {
  json {
    skip_on_invalid_json => true
    source => "message"
    target => "jsonData"
    add_tag => [ "_message_json_parsed" ]    
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "in_elk_test"
  }

  stdout{

  }
}

I'm getting the following output when I run Logstash:

Log:

[INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.9.0) {:es_version=>8}
[WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"in_elk_test"}
[INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>16, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2000, "pipeline.sources"=>["D:/elk/logstash-8.9.0-windows-x86_64/logstash-8.9.0/logstash-simple.conf"], :thread=>"#<Thread:0x50d159d0@D:/elk/logstash-8.9.0-windows-x86_64/logstash-8.9.0/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.05}
[INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"D:/elk/logstash-8.9.0-windows-x86_64/logstash-8.9.0/data/plugins/inputs/file/.sincedb_f2779ebeeb58467d208ce626cfa73491", :path=>["D:/elk/logs_folder/logs1.json"]}
[INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[INFO ][filewatch.observingtail  ][main][223ec84e00c300043960ade7a8b1b9aa2a896b167223b1bf197e641e0ac119cd] START, creating Discoverer, Watch with file and sincedb collections
[INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

After that last statement, Logstash doesn't parse my JSON file and doesn't push any data to the index.


Please help me figure out the issue and address it. Thanks in advance!

M.A.Murali
1 Answer


[INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"D:/elk/logstash-8.9.0-windows-x86_64/logstash-8.9.0/data/plugins/inputs/file/.sincedb_f2779ebeeb58467d208ce626cfa73491", :path=>["D:/elk/logs_folder/logs1.json"]}

In your file input you also need to set `sincedb_path` to make sure that you read your file from the beginning; otherwise, if you've already started Logstash a few times, it will resume reading from the end of the file:

    file {
            path => "D:/elk/logs_folder/test.json"
            start_position => "beginning"
            sincedb_path => "NUL"
            codec => "json"
    }
Val
  • I have added sincedb_path => "NUL", but still facing same issue and stuck at the same place Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}, Could you check? – M.A.Murali Aug 14 '23 at 21:14
  • Since you have `skip_on_invalid_json => true` are you certain that your file contains valid JSON data, one record per line? – Val Aug 15 '23 at 03:18
  • My JSON file isn't in single line, its in 3 lines as shown in the question – M.A.Murali Aug 15 '23 at 06:00
  • Oh, my bad, I overlooked it; that's why it doesn't work then. JSON logs need to be on a single line – Val Aug 15 '23 at 06:16
  • I have changed the JSON file as single line {"platform": "Computer", "mode": "Live", "users": 226} -- still same issue and I have tried again without filter plugin, still no luck, getting same line, no parse after that – M.A.Murali Aug 15 '23 at 07:26
  • Is there a newline after each line? i.e. each JSON record MUST be on its own line – Val Aug 15 '23 at 07:30
  • No newline in the JSON file. – M.A.Murali Aug 15 '23 at 09:54
  • Then add some and it will work – Val Aug 15 '23 at 15:37
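Since the file is produced by a Python script, a minimal sketch of the fix discussed in the comments above: write each record as a single line of JSON terminated by a newline (NDJSON), so the Logstash file input emits one event per line. The record contents and file name here mirror the question; the script structure itself is hypothetical, not the asker's actual code.

```python
import json

# Example record, matching the shape shown in the question.
record = {"platform": "Computer", "mode": "Live", "users": 899}

# Append one compact JSON object per line, with a trailing newline.
# The newline is what lets the Logstash file input pick up the line
# as a complete event.
with open("test.json", "a", encoding="utf-8") as f:
    f.write(json.dumps(record) + "\n")
```

With `json.dumps` the record stays on a single line (no `indent=` argument), and the explicit `"\n"` provides the newline the last comment says is required.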