I'm using Fluent Bit to parse logs. I have logs in the following format:

{"key1":"value1","key2":"{\n \"date\": \"2021-07-05 13:58:20.501636\",\n    \"timezone_type\": 3,\n    \"timezone\": \"UTC\"\n}", "key3":"{ \n \"somedata\": \"somevalue\" "}

In ES and Kibana I get something like that:

key1: value1
key2: {
        date: 2021-07-05 13:58:20.501636
        timezone_type: 3
        timezone: UTC
      }
key3: {
        somedata: somevalue
      }

So some of the fields are parsed, but I want every field parsed into a flat key, like this:

key1: value1
key2.date: 2021-07-05 13:58:20.501636
key2.timezone_type: 3
key2.timezone: UTC
key3.somedata: somevalue
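For reference, the flat shape I'm after corresponds to decoding each stringified JSON value and prefixing its members with the parent key. A minimal Python sketch of that transformation (the flatten helper is hypothetical, just to illustrate the desired output, not part of Fluent Bit):

```python
import json

def flatten(record):
    """Decode string values that contain JSON and emit their
    members under dotted keys; leave everything else as-is."""
    flat = {}
    for key, value in record.items():
        try:
            nested = json.loads(value)
        except (TypeError, ValueError):
            nested = None
        if isinstance(nested, dict):
            for k, v in nested.items():
                flat[f"{key}.{k}"] = v
        else:
            flat[key] = value
    return flat

record = {
    "key1": "value1",
    "key2": '{"date": "2021-07-05 13:58:20.501636", "timezone_type": 3, "timezone": "UTC"}',
}
print(flatten(record))
# {'key1': 'value1', 'key2.date': '2021-07-05 13:58:20.501636',
#  'key2.timezone_type': 3, 'key2.timezone': 'UTC'}
```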

using this Fluent Bit config:

[FILTER]
    Name parser
    Parser api
    Match *
    Reserve_Data On
    Reserve_Key On
    Key_Name log
    Merge_Log  on
    Merge_JSON_Key log
[PARSER]
    Name   api
    Format json
    Time_Key date
    Time_Format %Y-%m-%d %H:%M:%S.%u
    Time_Keep On

I tried to decode the fields with decoders such as Decode_Field_As escaped log, but nothing changed; the log is written out in the same format.

ivanovUA

2 Answers

Two potential issues:

  1. The issue could be with the FILTER that is being used. Instead of Merge_JSON_Key log, try Merge_Log_Key log_processed. Note that we changed the value to log_processed too:

    [FILTER]
        Name parser
        Parser api
        Match *
        Reserve_Data On
        Reserve_Key On
        Key_Name log #Not sure if this is necessary??
        Merge_Log  on
        Merge_Log_Key log_processed
    
  2. If that doesn't work, then it's probably data-related. By the look of things, the JSON data is being sent to the logs as a string rather than as a JSON object.

Check my other answer, which explains how we solved a similar issue: How to split log (key) field with fluentbit?
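The second point can be demonstrated directly: when the nested value arrives as a string, one decode pass still leaves it as a string, and a second json.loads is needed. A minimal Python sketch (the sample line is hypothetical, shortened from the question's format):

```python
import json

# outer document where key2's value is itself a JSON string
raw = '{"key1":"value1","key2":"{\\"date\\": \\"2021-07-05\\"}"}'
outer = json.loads(raw)
print(type(outer["key2"]))         # <class 'str'> -- still a string
inner = json.loads(outer["key2"])  # a second decode is required
print(inner["date"])               # 2021-07-05
```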

wjkw1


You should use Decode_Field_As json log in place of Decode_Field_As escaped log.

It will decode the JSON into separate fields in Kibana.
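A parsers.conf sketch of that change, assuming the stringified fields are named key2 and key3 as in the question (decoder behavior depends on your Fluent Bit version, so treat this as a starting point):

    [PARSER]
        Name   api
        Format json
        Time_Key    date
        Time_Format %Y-%m-%d %H:%M:%S.%u
        Time_Keep   On
        # decode the stringified JSON values rather than treating them as escaped text
        Decode_Field_As json key2
        Decode_Field_As json key3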

S.B
gaurav sharma