
I am new to the ELK stack. My requirement is to read several .log files and analyze the data in Kibana.

In the log files, I have several occurrences of a certain keyword, let's say "xyz".

Is there any way I can create a field for this keyword ("xyz") in the Logstash conf file?

I have Googled, watched YouTube videos, and read the materials, but the examples use grok's WORD pattern, which is not going to help, since every string of letters falls under WORD.

Please help.
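For reference, a minimal grok sketch of what I am trying, assuming each line starts with a bracketed tag such as `[xyz] today is monday` (the field names `tag` and `text` are just illustrative, and the `drop` filter is optional if only the xyz events matter):

```conf
filter {
  grok {
    # capture the bracketed keyword into "tag" and the rest of the line into "text"
    match => { "message" => "\[%{WORD:tag}\]%{SPACE}%{GREEDYDATA:text}" }
  }
  # optionally keep only events whose tag is "xyz"
  if [tag] != "xyz" {
    drop { }
  }
}
```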

  • Could you add some specifics? e.g. dummy data from log files and the desired output? – aaaa Aug 09 '20 at 02:44
  • Hi @aaaa, thanks for the response. I can't post the exact file, but please consider a file in the following template: `[xyz] today is monday`, `[abc] I don't care about the day`, `[xyz]Well you should do.`, and so on. I want xyz to be one of the fields. How can I achieve this? Please let me know if any more clarification is required. Thanks – harendra pratap singh Aug 09 '20 at 13:44
  • Is there a finite number of [xyz]'s? Do you need all [xyz]'s in an array? I guess I'm just trying to understand how you want them grouped. It looks like you're showing us multiple events, in which case you might simply use a pattern to grab the timestamp followed by the field in question. If you're not interested in capturing all of them, is there anything to distinguish the one [xyz] that you're looking for in the log file? I'd like to help you out, but this problem seems too abstract as is. – aaaa Aug 09 '20 at 16:59
  • Also, please understand that the format (template) you've provided can easily be matched, but I suspect that it's not verbatim. What do your dates look like? Do all lines end with "" or do things simply jump down to the next line? If you could do a simple substitution on the text you've provided, it'd be helpful. I'm not sure if it's because you're worried about breaking NDA, but I don't think I could provide you anything that'd be remotely helpful to you unless you gave us a more complete example. – aaaa Aug 09 '20 at 17:11

0 Answers