I'm new to Logstash and grok, and I need to parse some very custom log files. I can't find a good tutorial anywhere that covers this. I tried the syslog example, but it doesn't work in my case.
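For reference, the filter I tried is essentially the stock syslog example from the Logstash documentation (reproduced here from memory, so the details may be slightly off):

```
filter {
  grok {
    # Standard syslog-style pattern from the Logstash config examples
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
  }
  date {
    match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }
}
```

It doesn't give me the individual fields I need out of the message body.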
Here is an example of a line I need to parse:
Nov 19 00:06:37 srv-fe-05 ssh-server-g3: 2037 Sft_server_open_dir, "2037 Sft_server_open_dir, Directory name: /home/folder1/input, File handle: 007800B000782170, "Success", Session-Id: 162351"
The items I'm looking to extract:
- Timestamp: Nov 19 00:06:37, which should be combined with the current year and stored in Elasticsearch as the event timestamp.
- Server host: srv-fe-05
- Folder name: folder1, from /home/folder1/input
- Success status: "Success"
- Session ID: 162351, from Session-Id: 162351
Any help or directions would be appreciated.
Following the answer, I came up with this pattern:
%{SYSLOGTIMESTAMP:logTimestamp} %{USERNAME:sftpServer} %{USERNAME:processName}: %{INT:operationType} %{WORD}, \"%{INT} %{WORD}, %{WORD} %{WORD}: /%{WORD}/%{WORD:clientName}/%{WORD}, %{WORD} %{WORD}: %{WORD:submissionId}, \"%{WORD:status}\", %{WORD}-%{WORD}: %{INT:sessionId}
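For context, this is roughly how it sits in my config. The date filter is my attempt at the timestamp requirement, since SYSLOGTIMESTAMP carries no year and, as far as I understand, the date filter then assumes the current year; I'm not certain these are the right options:

```
filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:logTimestamp} %{USERNAME:sftpServer} %{USERNAME:processName}: %{INT:operationType} %{WORD}, \"%{INT} %{WORD}, %{WORD} %{WORD}: /%{WORD}/%{WORD:clientName}/%{WORD}, %{WORD} %{WORD}: %{WORD:submissionId}, \"%{WORD:status}\", %{WORD}-%{WORD}: %{INT:sessionId}" }
  }
  date {
    # The source timestamp has no year, so the parsed date should pick up
    # the current year and land in @timestamp.
    match  => [ "logTimestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    target => "@timestamp"
  }
}
```

If that is not the right way to wire it up, corrections are welcome.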
My 2 new questions are:
- How efficient is it, and what are some ways to make it more efficient? One idea I came across is anchoring the pattern, as in the sketch below.
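I read that anchoring a grok expression to the start of the line can reduce backtracking when a line does not match; this is just my guess at how that would look (pattern shortened to the leading fields purely to illustrate the anchor; in practice the full pattern above would follow), and I have not measured it:

```
filter {
  grok {
    # The leading ^ makes non-matching lines fail immediately instead of the
    # regex engine retrying the match at every position in the line.
    match => { "message" => "^%{SYSLOGTIMESTAMP:logTimestamp} %{USERNAME:sftpServer} %{USERNAME:processName}: %{INT:operationType} %{GREEDYDATA:restOfMessage}" }
  }
}
```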