
I am using fluentd to centralize log messages in Elasticsearch and view them with Kibana. When I view log messages, messages that occurred in the same second are out of order, and the milliseconds in @timestamp are all zeros:

2015-01-13T11:54:01.000-06:00   DEBUG   my message

How do I get fluentd to store milliseconds?

David Wartell

2 Answers


fluentd does not currently support sub-second resolution: https://github.com/fluent/fluentd/issues/461

I worked around this by using record_reformer to add a new field to all of the log messages, storing nanoseconds since the epoch.

For example, if your fluentd has inputs like these:

#
# Syslog
#
<source>
    type syslog
    port 5140
    bind localhost
    tag syslog
</source>

#
# Tomcat log4j json output
#
<source>
    type tail
    path /home/foo/logs/catalina-json.out
    pos_file /home/foo/logs/fluentd.pos
    tag tomcat
    format json
    time_key @timestamp
    time_format "%Y-%m-%dT%H:%M:%S.%L%Z"
</source>

Then change them to look like this, and add a record_reformer match that populates a nanosecond field:

#
# Syslog
#
<source>
    type syslog
    port 5140
    bind localhost
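    # the cleanup. tag prefix routes these events through the
    # record_reformer <match cleanup.**> block below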
    tag cleanup.syslog
</source>

#
# Tomcat log4j json output
#
<source>
    type tail
    path /home/foo/logs/catalina-json.out
    pos_file /home/foo/logs/fluentd.pos
    tag cleanup.tomcat
    format json
    time_key @timestamp
    time_format "%Y-%m-%dT%H:%M:%S.%L%Z"
</source>

<match cleanup.**>
    type record_reformer
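    # nanoseconds since the epoch, taken from fluentd's clock at
    # processing time (not the original event time; see comments below)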
    time_nano ${t = Time.now; ((t.to_i * 1000000000) + t.nsec).to_s}
    tag ${tag_suffix[1]}
</match>

Then add the time_nano field to your Kibana dashboards and sort on it instead of @timestamp, and everything will be in order.
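
For reference, a reformed syslog event then carries the extra field. A sketch of what a record might look like (field names come from fluentd's syslog input; the values, including time_nano, are illustrative):

{
    "host": "localhost",
    "ident": "myapp",
    "pid": "1234",
    "message": "my message",
    "time_nano": "1421171641123456789"
}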

David Wartell
  • Thanks for the answer. A Fluentd maintainer here. I will keep this issue in mind as we think more about the sub-second timestamp support (it's a known issue/design decision). – Kiyoto Tamura Jan 14 '15 at 22:12
  • Thanks for the attention to this issue, Kiyoto Tamura. The workaround is less than ideal because the timestamp is generated by fluentd rather than taken from the log file, which may already have at least millisecond precision. It would be best to first parse millisecond precision from the time format, and then append fluentd's current nanosecond value at parse time to keep ordering within the same millisecond (or within the same second, when the parsed message has only one-second resolution, as syslog does). – David Wartell Jan 15 '15 at 23:05
  • Hi @DavidWartell, do you think it would be better to use the variable ${time} from fluent-plugin-record-reformer? So instead of `Time.now` we can get the time of the event instead of fluentd's time. – clarete Jan 19 '15 at 21:23
  • @DavidWartell, Thanks for sharing your solution. Had the same problem. @clarete, I was initially using just `${time}` which obviously doesn't work. I tried substituting `Time.now` with `${time}` in David Wartell's solution and that doesn't work either. The last few digits are all zero. I think that is because `${time}` doesn't store beyond seconds. – ksrini Feb 06 '15 at 10:41
  • Is there a specific reason you are converting it to a string? Why not just return the value as is? – mohamnag Jun 21 '16 at 10:16
  • I documented a better solution here: http://work.haufegroup.io/log-aggregation/#timestamp-fix, maybe check it out – dutzu Jul 14 '17 at 05:36
  • @dutzu you might want to consider adding that as a reply as well. As a comment, it is easily missed (It took me a while to notice it). – Yoav Gur Feb 05 '18 at 15:05

I struggled with this issue using Spring, Java, and FluencyLogbackAppender from logback-more-appenders.

In logback-spring.xml, in order to get milliseconds, set:

            <useEventTime>true</useEventTime>
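
For context, useEventTime sits inside the FluencyLogbackAppender definition in logback-spring.xml. A minimal sketch (the appender name, tag, host, and port here are placeholder values; consult the logback-more-appenders documentation for the full option list):

<appender name="FLUENCY" class="ch.qos.logback.more.appenders.FluencyLogbackAppender">
    <tag>myapp</tag>
    <remoteHost>localhost</remoteHost>
    <port>24224</port>
    <useEventTime>true</useEventTime>
</appender>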

Then, in fluentd, I had to add a filter to replace @timestamp with time (set by FluencyLogbackAppender, and containing milliseconds):

<filter **>
  @type record_transformer
  enable_ruby
  <record>
    @timestamp ${time.strftime('%Y-%m-%dT%H:%M:%S.%3N%z')}
  </record>
</filter>
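
With useEventTime enabled and this filter in place, @timestamp carries the event's milliseconds, e.g. 2015-01-13T11:54:01.123-0600 instead of .000.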
charlb