I have some custom log files I would like to parse so I can feed them into Logstash. I built a grok pattern to parse them, but admittedly I'm not very proficient in grok or regex, so I wanted to ask whether what I did could somehow be simplified/optimized.
Logs example:
system_info : Calculator[1]_Global@HKGL1V5KY33 (23572.0000000007A668B0) : [2020/10/14-15:43:17.975] : GridTaskProcessor::mainThreadRun() : Routing criteria : GridServiceName = SophisMonteCarlo
system_warning : Calculator[1]_Global@HKGL1V5KY33 (23572.0000000007A668B0) : [2020/10/14-15:43:07.840] :vDBFees::loadFeesDetailsMapping() : Broker detail fees mapping begin
system_info : NamingServer[standard]@BMALAN (10276.000000001C8DA890) : [2020/08/05-15:04:13.426] : GenericServiceLoader::InitGenericServices() : Initializing generic service 'MonitoringHubConnector'...
system_info : ServerAdminConsole[standard]@BMALAN (8880.000000001B7A75D0) : [2019/05/31-15:04:23.240] : ServerAdminConsole::backgroundWorker_DoWork() : Initializing Entries from naming service....
Grok Pattern:
\s*(?<verbosity>(.*?)(?=\ :))\s*:\s*(?<servicename>(.*?)(?=\@))@(?<servername>(.*?)(?=\ ))\s*\((?<threadid>(.*?)(?=\)))\)\s*:\s*\[(?<date>(.*?)(?=\-))\-(?<time>(.*?)(?=\]))\]\s*:\s*(?<class>(.*?)(?=\())\(\)\s*:\s*%{GREEDYDATA:message}
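For context on what a simplification might look like, here is a rough sketch in Python's `re` module (grok actually runs on Oniguruma and writes named groups as `(?<name>...)` rather than Python's `(?P<name>...)`, so this is only an approximation, not a drop-in grok pattern). The idea: `.*?` is already lazy, so each `(?=...)` lookahead that repeats the following delimiter is redundant, and negated character classes like `[^@]+` tend to be clearer and cheaper:

```python
import re

# Hypothetical simplified equivalent of the pattern above:
# the lookaheads are dropped and lazy .*? is replaced with
# negated character classes bounded by the real delimiters.
PATTERN = re.compile(
    r"\s*(?P<verbosity>\S+)\s*:\s*"                    # e.g. system_info
    r"(?P<servicename>[^@]+)@"                         # up to the '@'
    r"(?P<servername>\S+)\s*"                          # host name
    r"\((?P<threadid>[^)]+)\)\s*:\s*"                  # thread id in parens
    r"\[(?P<date>[^-\]]+)-(?P<time>[^\]]+)\]\s*:\s*"   # [date-time]
    r"(?P<class>[^(]+)\(\)\s*:\s*"                     # Class::method()
    r"(?P<message>.*)"                                 # rest of the line
)

line = ("system_info : Calculator[1]_Global@HKGL1V5KY33 "
        "(23572.0000000007A668B0) : [2020/10/14-15:43:17.975] : "
        "GridTaskProcessor::mainThreadRun() : Routing criteria : "
        "GridServiceName = SophisMonteCarlo")

m = PATTERN.match(line)
print(m.groupdict())
```

Against the first sample line this yields `verbosity=system_info`, `servicename=Calculator[1]_Global`, `servername=HKGL1V5KY33`, and so on; whether the same structure holds for every log line (e.g. the `:vDBFees` entry with no space after the colon) would need checking against more samples.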