
We have an ELK setup with Filebeat, Logstash, Elasticsearch, and Kibana. I need to aggregate request and response events in Logstash.

I have configured the pipeline as below. Log aggregation works without any issue if I use a single worker for the pipeline, but with multiple workers the aggregation does not happen. Is there any way to use multiple workers and still aggregate the logs?

  if [transaction] == "request" {
      aggregate {
          task_id => "%{id}"
          code => "
              map['method'] = event.get('method')
              map['request'] = event.get('request')
              map['user'] = event.get('user')
              map['application'] = event.get('application')
          "
          map_action => "create"
      }
      drop {} # drop the request before persisting, to save indexing space in Elasticsearch
  }
  if [message] =~ "TRANSACTION:response" {
      aggregate {
          task_id => "%{id}"
          code => "
              event.set('method', map['method'])
              event.set('request', map['request'])
              event.set('user', map['user'])
              event.set('application', map['application'])
          "
          map_action => "update"
          end_of_task => true # release the map once the response has been enriched
      }
  }
techzone4all

1 Answer


For the aggregate filter to work you can only use one worker. If you use more than one worker, your response event could be processed before your request, so the filter won't work.
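A hypothetical Ruby sketch (not Logstash internals; `handle` and the event hashes are made up for illustration) of why ordering matters: the request stores a correlation map, the response reads it, and if the response is handled first the map simply isn't there yet.

```ruby
# Toy model of request/response correlation keyed by a task id.
def handle(maps, event)
  case event[:transaction]
  when "request"
    # Store the map for this task id; the request event itself is dropped.
    maps[event[:id]] = { "method" => event[:method] }
    nil
  when "response"
    # Enrich the response from the stored map, if one exists yet.
    map = maps.delete(event[:id])
    map ? event.merge(map) : event
  end
end

req = { id: 1, transaction: "request", method: "GET" }
res = { id: 1, transaction: "response" }

# Single worker: the request is always processed first, so the
# response comes back enriched with "method" => "GET".
maps = {}
handle(maps, req)
p handle(maps, res)

# Multiple workers: the response can be handled before the request,
# so no map exists and the response silently stays unenriched.
maps = {}
p handle(maps, res)
```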

This is documented by Elastic:

You should be very careful to set Logstash filter workers to 1 (-w 1 flag) for this filter to work correctly otherwise events may be processed out of sequence and unexpected results will occur.
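One way to do this without forcing your whole Logstash instance down to one worker is to give the aggregation its own pipeline in `pipelines.yml` and set `pipeline.workers: 1` only there (the pipeline id and config path below are assumptions, adjust them to your layout):

```yaml
# pipelines.yml
- pipeline.id: aggregate
  path.config: "/etc/logstash/conf.d/aggregate.conf"  # hypothetical path
  pipeline.workers: 1   # equivalent to the -w 1 flag, scoped to this pipeline
```

Other pipelines on the same instance can keep their default worker count; only the events that need the aggregate filter pay the single-worker cost.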

leandrojmp