
I have a Java application in which I am using Log4j2 to print Logs in JSONLayout, here is a sample of the logs format:

{
    "thread": "TopicStatusThreadPool-thread-1",
    "level": "INFO",
    "loggerName": "My Kafka Logger",
    "message": "Topic Status List is empty, returning from summarize method",
    "endOfBatch": false,
    "loggerFqcn": "org.apache.logging.slf4j.Log4jLogger",
    "instant": {
        "epochSecond": 1587980988,
        "nanoOfSecond": 214241000
    },
    "threadId": 37,
    "threadPriority": 5
  }

The logs printed in this format are then picked up by Fluent Bit and pushed into Elasticsearch. I am using Kibana to visualize the logs, but since the time appears as epochSecond and nanoOfSecond, it becomes very difficult to relate the entries to the actual application logs.

Is there any Fluent Bit filter that can be used to convert this time format to a more human-readable one?

Currently I am using the basic JSON parser and the Kubernetes filter in my Fluent Bit config to add Kubernetes information to the log messages.
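For context, here is a minimal sketch of the setup described above; the input path, tag, and Elasticsearch host are assumptions for illustration, not my actual values:

```
[INPUT]
    Name              tail
    Path              /var/log/containers/*.log
    Parser            json
    Tag               kube.*

[FILTER]
    Name              kubernetes
    Match             kube.*
    Merge_Log         On

[OUTPUT]
    Name              es
    Match             *
    Host              elasticsearch
    Port              9200
```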

UPDATE:

I made changes to my Log4j2 configuration and now I get a timeMillis field, which holds the log time in milliseconds.

{
    "thread": "TopicStatusThreadPool-thread-1",
    "level": "INFO",
    "loggerName": "My Kafka Logger",
    "message": "Topic Status List is empty, returning from summarize method",
    "endOfBatch": false,
    "loggerFqcn": "org.apache.logging.slf4j.Log4jLogger",
    "timeMillis": 1587980988213,
    "threadId": 37,
    "threadPriority": 5
  }
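For reference, the configuration change was roughly this in log4j2.xml; this is a sketch, and the appender name and the other layout attributes are assumptions. The `includeTimeMillis` attribute is what makes JsonLayout emit `timeMillis` instead of the `instant` object:

```xml
<!-- sketch of the relevant appender; surrounding config omitted -->
<Appenders>
  <Console name="Console" target="SYSTEM_OUT">
    <!-- includeTimeMillis="true" emits timeMillis instead of instant -->
    <JsonLayout compact="true" eventEol="true" includeTimeMillis="true"/>
  </Console>
</Appenders>
```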

What would the Lua filter look like to convert this to a human-readable format? By default, Fluent Bit's time conversion doesn't support time in milliseconds; it expects the time in seconds.

I tried this:

[PARSER]
    Name        json
    Format      json
    Time_Key    timeMillis
    Time_Format %s
    Time_Keep   On

But this doesn't pick up the milliseconds part; it treats the whole value as seconds.

iamabhishek

2 Answers


You could use a Lua filter as follows:

-- test.lua
function append_converted_timestamp(tag, timestamp, record)
    local new_record = record
    -- format epochSecond as a human-readable local-time string
    new_record["instant"]["recorded_time"] = os.date("%m/%d/%Y %I:%M:%S %p", record["instant"]["epochSecond"])
    -- return code 2: record modified, original timestamp kept
    return 2, timestamp, new_record
end

FluentBit configuration:

[FILTER]
    Name    lua
    Match   *
    script  test.lua
    call    append_converted_timestamp

It will append a new field, recorded_time, with a human-readable date to your record.

UPDATE

The Lua function for the timeMillis field could be implemented like this:

-- test.lua
function append_converted_timestamp(tag, timestamp, record)
    local new_record = record
    -- timeMillis is in milliseconds; os.date expects whole seconds
    new_record["recorded_time"] = os.date("%m/%d/%Y %I:%M:%S %p", math.floor(record["timeMillis"] / 1000))
    return 2, timestamp, new_record
end
rmax
  • Hi, thanks for the answer. Can you update the Lua filter for the timeMillis field? How can I handle this? – iamabhishek Sep 17 '20 at 04:50
  • I guess you may just divide timeMillis by 1000 in the Lua function to get seconds. Something like this: `new_record["recorded_time"] = os.date("%m/%d/%Y %I:%M:%S %p", record["timeMillis"]/1000)` – rmax Oct 13 '20 at 20:30

As the author has mentioned, Lua does not support a milliseconds format. @rmax's answer is good, but it doesn't capture the milliseconds part and add it to the record. I was also unable to find a direct approach, but I found a workaround.

function append_converted_timestamp(tag, timestamp, record)
    -- epoch_time is in milliseconds; format it as seconds with 3 decimals
    local convertedToDecimal = string.format("%.3f", record["epoch_time"] / 1000)
    -- split into the integer (seconds) and decimal (milliseconds) parts
    local integer, decimal = string.match(convertedToDecimal, "([^.]+)%.(.+)")
    -- %H gives the 24-hour clock, matching the ISO-8601-style layout
    local convertedDate = os.date("%Y-%m-%dT%H:%M:%S", tonumber(integer))
    record["time"] = convertedDate .. "." .. decimal
    -- drop the original field to avoid duplicate time fields
    record["epoch_time"] = nil
    return 1, timestamp, record
end

The epoch_time field corresponds to your timeMillis field. Additionally, I removed the original epoch_time field and added a new field named time; otherwise the final record would have duplicate time fields.
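The same split-and-format logic can be checked outside Fluent Bit. Here is a minimal Python sketch of the conversion (Python stands in for Lua purely for illustration, and UTC is assumed here for a deterministic result, whereas `os.date` without a `!` prefix uses local time):

```python
import datetime

def convert_millis(epoch_millis):
    # split into whole seconds and the millisecond remainder
    seconds, millis = divmod(epoch_millis, 1000)
    # format the seconds part in an ISO-8601-style layout (UTC)
    dt = datetime.datetime.fromtimestamp(seconds, tz=datetime.timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%S") + ".%03d" % millis

print(convert_millis(1587980988213))  # → 2020-04-27T09:49:48.213
```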

AnujAroshA