
I am using the elapsed plugin to calculate durations and then the aggregate plugin to display them. I added custom fields to the elapsed filter, which you can see below:

 add_field => {
    "status" => "Status"
    "User" => "%{byUser}"
 }

One is static, the other is dynamic, coming from the event. In the Logstash output only the static value is displayed, not the dynamic one.

For the dynamic field it displays the literal `%{byUser}`. The task id and status fields work just fine and I get the right values.

Any idea why?
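To illustrate what I mean, here is a minimal pipeline sketch (generator input assumed, not my real input) showing the same behaviour: Logstash leaves a `%{field}` sprintf reference as literal text when the field does not exist on the event being processed.

```
input { generator { count => 1 message => "hello" } }
filter {
  mutate {
    # byUser does not exist on this event, so the sprintf
    # reference is left unresolved as literal text
    add_field => { "User" => "%{byUser}" }
  }
}
output { stdout { codec => rubydebug } }   # shows User => "%{byUser}"
```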

A little bit more code:

elapsed {
    unique_id_field => "assetId"
    start_tag => "tag1:tag2"
    end_tag => "tag3:tag4"
    add_field => {
       "wasInStatus" => "tag3"
       "User" => "%{byUser}"
    }
    add_tag => ["CustomTag"]
  }

My grok filter:

 grok {
     match => [
         "message", "%{TIMESTAMP_ISO8601:timestamp} %{NUMBER:assetId} %{WORD:event}:%{WORD:event1} User:%{USERNAME:byUser}"
     ]
 }


if "CustomTag" in [tags] and "elapsed" in [tags] {
    aggregate {
        task_id => "%{assetId}"
        code => "event.to_hash.merge!(map)"
        map_action => "create_or_update"
    }
}
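A side note on the `code` block above: on newer Logstash versions `event.to_hash` returns a copy of the event data, so merging `map` into it may silently have no effect on the event itself. A sketch of an explicit copy instead (same task id and map action):

```
aggregate {
    task_id => "%{assetId}"
    # copy each entry of the aggregate map onto the event explicitly
    code => "map.each { |k, v| event.set(k, v) }"
    map_action => "create_or_update"
}
```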

The problem is connected with this elapsed filter option:

new_event_on_match => true/false

Changing `new_event_on_match` to `false` (it was `true` in my pipeline) fixed the issue, but I still wonder why.
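For reference, a sketch of the elapsed filter with the fix applied (same fields and tags as above):

```
elapsed {
    unique_id_field => "assetId"
    start_tag => "tag1:tag2"
    end_tag => "tag3:tag4"
    # keep the elapsed data on the original end event instead of
    # generating a fresh event, so %{byUser} can be resolved
    new_event_on_match => false
    add_field => {
        "wasInStatus" => "tag3"
        "User" => "%{byUser}"
    }
    add_tag => ["CustomTag"]
}
```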

user3884425
  • What is `byUser`? Is that a field in your current event? – Val Sep 07 '16 at 13:00
  • `byUser` is passed to every input and attached to every event. I found why this is happening: the `new_event_on_match => true/false` property. If it is false, the proper value is passed to the aggregate plugin; if it is set to true, only the literal `%{byUser}` tag is passed. Still don't know why this helped. – user3884425 Sep 07 '16 at 13:56
  • You'll need to show a bit more of your pipeline (anything relevant) if you want a chance to get your problem solved... Help us help you! – Val Sep 07 '16 at 13:57
  • added in main topic – user3884425 Sep 07 '16 at 15:25

1 Answer


I also faced a similar issue recently and found a fix for it. When `new_event_on_match => true` is used, the elapsed event is separated from the original log and a new elapsed event is indexed into Elasticsearch, as below:

{
  "_index": "elapsed_index_name",
  "_type": "doc",
  "_id": "DzO03mkBUePwPE-nv6I_",
  "_version": 1,
  "_score": null,
  "_source": {
    "execution_id": "dfiegfj3334fdsfsdweafe345435",
    "elapsed_timestamp_start": "2019-03-19T15:18:34.218Z",
    "tags": [
      "elapsed",
      "elapsed_match"
    ],
    "@timestamp": "2019-04-02T15:39:40.142Z",
    "host": "3f888b2ddeec",
    "cus_code": "Custom_name",          // <-- this is a custom field
    "elapsed_time": 41.273,
    "@version": "1"
  },
  "fields": {
    "@timestamp": [
      "2019-04-02T15:39:40.142Z"
    ],
    "elapsed_timestamp_start": [
      "2019-03-19T15:18:34.218Z"
    ]
  },
  "sort": [
    1554219580142
  ]
}

For adding the `cus_code` to the elapsed event object from the original log (the log where the elapsed filter's end tag is detected), I added an aggregate filter as below:

if "elapsed_end_tag" in [tags] {
    aggregate {
       task_id => "%{execution_id}"
       code => "map['cus_code'] = event.get('custom_code_field_name')"
       map_action => "create"
    }
}

and added the end block of the aggregation, gated on the 'elapsed' tag:

if "elapsed" in [tags] {
       aggregate {
          task_id => "%{execution_id}"
          code => "event.set('cus_code', map['cus_code'])"
          map_action => "update"
          end_of_task => true
          timeout => 400
      }
}

So, to add a custom field to the elapsed event, we need to combine the aggregate filter with the elapsed filter.
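Putting it all together, the relevant filter section would look roughly like this. The `elapsed_end_tag` value matches the condition used above; the `elapsed_start_tag` name and `custom_code_field_name` are placeholders for whatever your pipeline actually uses:

```
filter {
  elapsed {
    unique_id_field  => "execution_id"
    start_tag        => "elapsed_start_tag"    # assumed start tag name
    end_tag          => "elapsed_end_tag"
    new_event_on_match => true
  }

  # On the original end-tag event: stash the custom field in the map
  if "elapsed_end_tag" in [tags] {
    aggregate {
      task_id    => "%{execution_id}"
      code       => "map['cus_code'] = event.get('custom_code_field_name')"
      map_action => "create"
    }
  }

  # On the generated elapsed event: copy the stashed value onto it
  if "elapsed" in [tags] {
    aggregate {
      task_id     => "%{execution_id}"
      code        => "event.set('cus_code', map['cus_code'])"
      map_action  => "update"
      end_of_task => true
      timeout     => 400
    }
  }
}
```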

krishnacm