
I'm feeding Logstash with Kafka input. The messages look like this:

        "_index" : "progress",
        "_id" : "Q27Y2IYBIZUq2eJ6WJQR",
        "_score" : 1.0,
        "_source" : {
          "itemId" : 3,
          "weight" : 358,
          "timeStartedInMillis" : 39131,
          "flow" : 3725,
          "@version" : "1",
          "@timestamp" : "2023-03-13T02:41:42.313784Z",
          "event" : {
            "original" : "{\"timeStartedInMillis\": 39131, \"procedureId\": 3, \"temperature\": 47, \"weight\": 358, \"flow\": 3725}"
          },
          "type" : "log",
          "temperature" : 47,
          "tags" : [
            "kafka_source"
          ]
        }

How can I filter the message so that the output contains only this:

    {
        "_index" : "progress",
        "itemId" : 3,
        "weight" : 358,
        "timeStartedInMillis" : 39131,
        "flow" : 3725,
        "temperature" : 47,
    }
Martin Dvoracek

1 Answer


You can use the Logstash prune filter to send only specific fields to Elasticsearch.

`logstash.conf`:

    input { ... }
    filter {
      prune {
        whitelist_names => [ "itemId", "weight", "timeStartedInMillis", "flow", "temperature" ]
      }
    }
    output { ... }
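
Note that `whitelist_names` entries are regular expressions matched against field names, so an unanchored pattern can also match fields whose names merely contain that string. If you want exact matches, anchoring the patterns is one option (a sketch using the same field list):

    filter {
      prune {
        whitelist_names => [ "^itemId$", "^weight$", "^timeStartedInMillis$", "^flow$", "^temperature$" ]
      }
    }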

Another option is the drop filter, but note that it discards entire events rather than individual fields, so it only helps if you want to skip some messages completely.
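
For example, a minimal sketch that discards every event whose `type` field (present in your sample) is not `log`:

    filter {
      if [type] != "log" {
        drop { }
      }
    }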

Musab Dogan
  • It is also possible to use the [`mutate/remove_field` filter](https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html#plugins-filters-mutate-remove_field) – Val May 22 '23 at 09:39
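
A sketch of that commented approach, removing the extra fields visible in the sample (I've left `@timestamp` alone since it is Logstash's own event timestamp; extend the list as needed):

    filter {
      mutate {
        remove_field => [ "@version", "event", "type", "tags" ]
      }
    }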