
I am new to the ELK stack. I am trying to analyze and process Java stack logs from my application to extract useful information. After starting Logstash, I can see my data under the Discover tab on the Kibana portal, but as soon as I try to create a visualization, I do not get all the fields there, and the data is inconsistent. Can someone please clarify what is wrong in my case? My snippets are below.

Sample log file:

Thread #1: t@-1824503488, lwp=10848, ref=0xb8535a0, session=84521D86BBE4D71690E2A6EF29A489BE:mx135721697661ec76bb:(WebServiceFacade.java:84), ms=0xa0819a40
<Start Stack Trace>
  <1 - ADK Verbose Trace Entry>
  stateless dispatch for invokeClass.bosInterface executing
  Active: 18 minutes  50.00 seconds 
  User:
    User1
  Tenant:
    
  Session:
    84521D86BBE4D71690E2A6EF29A489BE:mx135721697661ec76bb:(WebServiceFacade.java:84)
  Parameters:
    bosContext _cntx:
      user:
        User Ag
      depth:
        3
      session id:
        84521D86BBE4D71690E2A6EF29A489BE:mx135721697661ec76bb:(WebServiceFacade.java:84)
    bosUTF _className:
      XXXXXProcess
    bosStringList _construct:
      2 entries
          $$MXRIP$$|java.util.HashMap
          10
    bosUTF _methodName:
      YYYYYYCopies
    bosStringList _params:
      2 entries
          $$MXRIP$$|java.util.HashMap
          9 

logstash.conf file:

    input {
        file {
            path => "C:/Temp/BOHLogs/SampleJava.log"
            start_position => "beginning"
            sincedb_path => "NUL" 
            codec => multiline {
                pattern => "^Thread%{SPACE}%{GREEDYDATA}"
                negate => "true"
                what => "previous"
            }
        }
    }
    filter {
        grok {
            match => [ "message", "%{NOTSPACE:Thread}:\s%{GREEDYDATA:Data}" ]
        }
        kv {
            source => "Data"
            field_split => " "
            value_split => "="
        }
        kv {
            source => "Data"
            field_split => "\n"
            value_split => ":"
        }
        date {
            match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS Z", "MMM dd, yyyy HH:mm:ss a"]
        }
    }
    output {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "dummya_log"
        }
        stdout {codec => rubydebug}
    }
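
In case it helps, this is a cut-down config I have been using to check the grok pattern on its own (a minimal sketch: the file input and Elasticsearch output are replaced by stdin/stdout so a single log line can be pasted in and the parsed fields inspected):

    input {
        # paste a log line into the console instead of reading from the file
        stdin { }
    }
    filter {
        # same grok pattern as in the full config above
        grok {
            match => [ "message", "%{NOTSPACE:Thread}:\s%{GREEDYDATA:Data}" ]
        }
    }
    output {
        # print the resulting event with all parsed fields to the console
        stdout { codec => rubydebug }
    }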

[screenshot] Output in the Logstash window / available fields in the Logstash window

[screenshot] Available fields in the Visualize tab

  • See the guide below. [tour](https://stackoverflow.com/tour) / [how-to-ask](https://stackoverflow.com/help/how-to-ask) / [DO NOT post images of code, data, error messages, etc. - copy or type the text into the question.](https://meta.stackoverflow.com/questions/285551/why-not-upload-images-of-code-errors-when-asking-a-question/285557#285557) The error log in your logstash window can be expressed as text. – myeongkil kim Feb 10 '21 at 10:09

0 Answers