2

Since Kibana has problems with field keys that start with an underscore - link (this issue seems unresolved), I am not able to process field keys that come with a leading underscore (e.g. journald logs from Docker) in Kibana. I am currently using Logstash to push the logs to Elasticsearch. I read this answer, which uses a ruby filter to remove all underscores, but I suspect this method would make my consumer very slow.

Is there a way to remove the leading underscores from all the field names using the power of regex in Logstash?

For example -

_HELLO: World

should now change to be:

HELLO: World

probably by using a plugin other than ruby

Akash
  • @SufiyanGhori I am not using fluentd for this. I am using fluentbit for my configuration. Thanks for replying though. – Akash May 31 '18 at 10:09

3 Answers

1

You can use the kv filter to remove a prefix from keys using regex. It automatically parses messages (or specific event fields) of the foo=bar variety, and it has a configuration option, remove_char_key, that removes a string of characters from the key.

For instance, this will remove the <, >, [, ] and , characters from keys:

filter {
  kv {
    remove_char_key => "<>\[\],"
  }
}

Another option is remove_char_value, which can be used to remove characters from values.

For instance, this will remove the <, >, [, ] and , characters from values:

filter {
  kv {
    remove_char_value => "<>\[\],"
  }
}

Both can be combined with source, which defines the field to perform the key=value parsing on:

filter { 
  kv { 
    source => "message" 
    remove_char_value => "<>\[\],"
    remove_char_key => "<>\[\],"
  } 
}

Please also have a look at the trim_key and trim_value options.
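For the leading-underscore case in the question specifically, trim_key may be the closer fit, since it strips the given characters only from the start and end of each parsed key. A minimal sketch (untested against your pipeline, and note it only applies to keys the kv filter itself parses out of source):

```
filter {
  kv {
    source   => "message"
    trim_key => "_"
  }
}
```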

Sufiyan Ghori
0

If you just want to remove an underscore that's the first character of a field called fieldname, you can use:

  mutate {
    gsub => [
      "fieldname", "^_", ""
    ]
  }
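The ^ anchor in the pattern means only an underscore at the very start of the value is removed; underscores elsewhere are left alone. The substitution can be sanity-checked in plain Ruby, which uses the same regex behaviour as gsub here:

```ruby
# "^_" only matches an underscore at the start of the string
puts "_HELLO".gsub(/^_/, "")   # leading underscore removed
puts "FOO_BAR".gsub(/^_/, "")  # interior underscore untouched
```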

See https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html#plugins-filters-mutate-gsub

baudsp
  • No. I want to do that for all the field names that start with an `_`. Thanks for your answer though. I made the question clearer. – Akash May 31 '18 at 09:55
  • @Akash I was not sure it would be useful, but I preferred to write it just in case. The only solution (without `ruby`) I can think of would be to know the field names in advance and use mutate.copy. Good luck. – baudsp May 31 '18 at 12:33
-1

You'll need to resort to a ruby filter that looks for keys starting with _ and renames them (copies the value to a new key, then removes the original key):

filter {
  ruby {
    code => "
      event.to_hash.keys.each { |k|
        if k.start_with?('_')
          event.set(k[1..-1], event.get(k))
          event.remove(k)
        end
      }
    "
  }
}
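The renaming logic can be sanity-checked outside Logstash with a plain Ruby hash standing in for the event (a sketch only; the real filter uses the Logstash Event API, not a hash). Note the slice must be k[1..-1] (a range, dropping the first character), not k[1,-1], which is a (start, length) call with a negative length and returns nil:

```ruby
# Simulate the filter: keys starting with "_" are renamed in place.
event = { "_HELLO" => "World", "level" => "info" }

event.keys.each do |k|
  if k.start_with?("_")
    event[k[1..-1]] = event.delete(k)  # k[1..-1] drops only the leading "_"
  end
end

puts event.inspect  # "_HELLO" has become "HELLO"
```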
Alcanzar
  • Thanks for replying. But as already mentioned I am not looking for a ruby filter. Seems it's not possible without it. – Akash Jun 04 '18 at 16:02
  • The answer you linked to is slow because it uses gsub specifically. Besides, any existing filter is going to be written in ruby. And Logstash is good about compiling the config file. – Alcanzar Jun 04 '18 at 16:40