
I have different patterns of logs in the same file, which I am loading into Logstash. I am writing multiple grok patterns to handle them. This currently works, but I am looking for a more efficient way to break the log into key-value pairs, using grok patterns or some other approach. Any suggestions or help here?

filter {
  grok {
    match => {
      "message" => [
        "%{TIMESTAMP_ISO8601:log_timestamp} %{GREEDYDATA:thread}, hostname=%{WORD:hostname}, level=%{WORD:level} , logger=%{WORD:logger}, et=%{NOTSPACE:et},%{GREEDYDATA:loginfo}, cid=%{WORD:cid}, workitem=%{NOTSPACE:workitem}, dev_id=%{NOTSPACE:dev_id}, model=%{GREEDYDATA:model}, customer_id=%{NOTSPACE:customer_id}, customer_name=%{GREEDYDATA:customer_name}, conn_id=%{NOTSPACE:conn_id}, ipaddr=%{IP:ipaddr}, hostname=%{GREEDYDATA:hostname}, serial_num=%{NOTSPACE:serialnumber}, model_support=%{NOTSPACE:model_support}, model_num=%{NOTSPACE:model_num}, mSKU=%{WORD:mSKU}, ocv_status=%{WORD:ocv_status}, mcid=%{GREEDYDATA:mcid}",
        "%{TIMESTAMP_ISO8601:log_timestamp} %{GREEDYDATA:thread}, hostname=%{WORD:hostname}, level=%{WORD:level} , logger=%{WORD:logger}, et=%{NOTSPACE:et},%{GREEDYDATA:log}",
        "%{TIMESTAMP_ISO8601:log_timestamp} %{GREEDYDATA:thread}, hostname=%{WORD:hostname}, level=%{WORD:level} , logger=%{WORD:logger},%{GREEDYDATA:log}"
      ]
    }
  }
}

  • I could figure this out by using the KV filter: https://www.elastic.co/guide/en/logstash/current/plugins-filters-kv.html – Durga Aug 29 '17 at 20:18
  • If kv is not powerful enough, but you do not need regular expressions, the dissect filter might be interesting to you as well: https://www.elastic.co/guide/en/logstash/5.5/plugins-filters-dissect.html – alr Aug 29 '17 at 20:23
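
As the comments suggest, the kv filter can replace the long list of per-format grok patterns: use grok only to capture the fixed prefix (timestamp and thread), then let kv split the remaining `key=value` pairs into fields. A minimal sketch under these assumptions (the intermediate field name `kvpairs` is illustrative, not part of the original config):

```
filter {
  grok {
    # Capture the fixed prefix; leave the rest of the line for the kv filter
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{GREEDYDATA:kvpairs}" }
  }
  kv {
    source      => "kvpairs"   # parse the remainder, not the whole message
    field_split => ","         # pairs are separated by commas
    value_split => "="         # keys and values are separated by '='
    trim_key    => " "         # drop the space that follows each comma
  }
}
```

One caveat: `field_split` treats each listed character as a delimiter, so values that themselves contain a comma (e.g. a free-text `customer_name`) would be split incorrectly; in that case `field_split_pattern`, available in recent versions of the kv plugin, allows a regex-based delimiter instead.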

0 Answers