I have a Java application in which I am using Log4j2 to print Logs in JSONLayout, here is a sample of the logs format:
{
  "thread": "TopicStatusThreadPool-thread-1",
  "level": "INFO",
  "loggerName": "My Kafka Logger",
  "message": "Topic Status List is empty, returning from summarize method",
  "endOfBatch": false,
  "loggerFqcn": "org.apache.logging.slf4j.Log4jLogger",
  "instant": {
    "epochSecond": 1587980988,
    "nanoOfSecond": 214241000
  },
  "threadId": 37,
  "threadPriority": 5
}
The logs printed in this format are picked up by Fluent Bit and pushed into Elasticsearch. I use Kibana to visualize them, but because the time is split into epochSecond and nanoOfSecond fields, it becomes very difficult to correlate entries with the actual application logs.
Is there any Fluent Bit filter that can be used to convert this time into a more human-readable format?
Currently I am using the basic JSON parser and the Kubernetes filter in my Fluent Bit config to add Kubernetes information to the log messages.
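For reference, my current filter setup looks roughly like this (the Match pattern and tag names are simplified, and I have omitted the cluster-specific options):

[FILTER]
    Name                kubernetes
    Match               kube.*
    Merge_Log           On
    Keep_Log            Off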
UPDATE:
I made changes to the Log4j2 configuration and now I get a timeMillis field, which carries the timestamp in epoch milliseconds:
{
  "thread": "TopicStatusThreadPool-thread-1",
  "level": "INFO",
  "loggerName": "My Kafka Logger",
  "message": "Topic Status List is empty, returning from summarize method",
  "endOfBatch": false,
  "loggerFqcn": "org.apache.logging.slf4j.Log4jLogger",
  "timeMillis": 1587980988213,
  "threadId": 37,
  "threadPriority": 5
}
What would a Lua filter look like to convert this to a human-readable format? By default, the time conversion in Fluent Bit doesn't support time in milliseconds; it expects the time in seconds.
I tried this:
[PARSER]
    Name        json
    Format      json
    Time_Key    timeMillis
    Time_Format %s
    Time_Keep   On
But this doesn't pick up the milliseconds part; it treats the whole value as seconds.
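This is the direction I'm imagining for the Lua filter, though I haven't got it working yet. It's an untested sketch: the file name filters.lua, the function name convert_time, and the output field name timestamp are just placeholders I picked; the callback signature and the [FILTER] keys are from the Fluent Bit Lua filter docs.

filters.lua:

-- Convert the timeMillis field (epoch milliseconds) into a
-- human-readable UTC timestamp stored in a new "timestamp" field.
function convert_time(tag, timestamp, record)
    local ms = record["timeMillis"]
    if ms == nil then
        -- Nothing to convert; return 0 to keep the record untouched.
        return 0, timestamp, record
    end
    local seconds = math.floor(ms / 1000)
    local millis  = ms % 1000
    -- os.date with a leading "!" formats in UTC rather than local time.
    record["timestamp"] = os.date("!%Y-%m-%d %H:%M:%S", seconds)
                          .. string.format(".%03d", millis)
    -- Return code 1 = record was modified, keep the original timestamp.
    return 1, timestamp, record
end

and the corresponding filter section:

[FILTER]
    Name    lua
    Match   *
    script  filters.lua
    call    convert_time

Is something like this the right approach, or is there a cleaner way to handle millisecond timestamps?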