
I need help understanding how to set the @timestamp field from the content of another field that contains a date and time with nanosecond precision. I've tried to use the date filter's match option, but it's not clear to me which pattern should be used:

date {
    match => ["timestamp_nano", "ISO8601"]
    target => "@timestamp"
}

where timestamp_nano is a timestamp with nanosecond precision (e.g. "2022-01-20T12:00:00.123456789Z").

With ISO8601 the precision is limited to milliseconds, and the remaining digits are not carried over into the @timestamp field.

Which pattern should be used in the match?

Thanks in advance

alfven

3 Answers


This answer is for 8.0.0-rc1, since 8.0 is not released yet. The date filter still uses the Joda library, which is limited to millisecond precision. The PR that introduced nanosecond precision updated the underlying LogStash Timestamp class to support nanoseconds, but not all of the classes that use it.
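As a side note on why this is possible at all: Ruby's own Time class (which the updated Timestamp class builds on) stores sub-second precision as a Rational, so the full nanosecond value survives parsing. A minimal sketch in plain Ruby, outside Logstash:

```ruby
require 'time'

# Time.iso8601 keeps the sub-second fraction as a Rational,
# so all nine digits are preserved (unlike Joda's millisecond limit).
t = Time.iso8601("2022-01-20T12:00:00.123456789Z")
puts t.nsec  # nanosecond component of the parsed time
```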

If I run

input { generator { count => 1 lines => [ '' ] } }
output { stdout { codec => rubydebug { metadata => false } } }

then I get microsecond precision for @timestamp

"@timestamp" => 2022-01-22T01:14:15.881459Z,

However, timestamps can be more accurate than that. If I use a json codec

input { generator { count => 1 lines => [ '{ "@timestamp": "2022-01-10T12:13:14.123456789"}' ] codec => json } }
output { stdout { codec => rubydebug { metadata => false } } }

I get nanosecond precision

"@timestamp" => 2022-01-10T17:13:14.123456789Z,

If you want to use the value of another field you could do something like

input { generator { count => 1 lines => [ '' ] } }
filter {
    mutate { add_field => { "foo" => "2022-01-10T12:13:14.123456789" } }
    mutate { add_field => { "bar" => '{ "@timestamp": "%{foo}" } ' } }
    json { source => "bar" }
}
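The trick above works because the mutate/json round trip copies the string verbatim, so no sub-second digits are lost before the json filter parses it. A plain-Ruby sketch of that round trip (the trailing "Z" is appended here only for the illustration; Logstash applies its own timezone handling, as the converted hour in the earlier output shows):

```ruby
require 'json'
require 'time'

# Simulate the mutate + json filter round trip: embed the field
# value in a JSON document, then parse it back out.
foo = "2022-01-10T12:13:14.123456789"
bar = JSON.generate({ "@timestamp" => foo })
parsed = JSON.parse(bar)

# The string came through unchanged, including all nine digits.
t = Time.iso8601(parsed["@timestamp"] + "Z")  # "Z" added for this sketch only
puts t.nsec
```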
Badger

Excellent answer above by @Badger, but I prefer using a single step with Ruby in Logstash 8.x:

input { generator { count => 1 lines => [ '' ] } }

filter {
  mutate { add_field => { "[@timestamp_nanoseconds]" => "2022-08-11T10:10:10.123456789Z" } }
  ruby {
    code => "event.set('[@timestamp]', LogStash::Timestamp.new(event.get('[@timestamp_nanoseconds]')) )"
  }
}

output { stdout {} }

I get a native @timestamp with nanosecond precision:

"@timestamp" => 2022-08-11T10:10:10.123456789Z,
Juan Domenech

I got the same result by adding a pipeline in Elasticsearch:

{
    "description": "mypipeline",
    "processors": [
        {
            "date": {
                "field": "timestamp_nano",
                "formats": ["ISO8601"],
                "output_format": "yyyy-MM-dd'T'HH:mm:ss.SSSSSSSSSZ"
            }
        }
    ]
}

The timestamp_nano field contains a timestamp with nanosecond precision.
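As a cross-check of what that output_format renders, the nine-digit fractional second (the SSSSSSSSS run) has a direct analogue in plain Ruby's strftime %9N directive. A sketch, not part of the pipeline itself:

```ruby
require 'time'

# %9N prints nine fractional-second digits, the plain-Ruby
# analogue of SSSSSSSSS in the ingest pipeline's output_format.
t = Time.iso8601("2022-01-20T12:00:00.123456789Z")
puts t.strftime("%Y-%m-%dT%H:%M:%S.%9NZ")
```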

In Logstash, the @timestamp can be updated by using a Ruby script as indicated here; it works, but the author notes that the script might not be optimised.

BRs

alfven
  • That is a nice solution if you run elasticsearch, but this is not tagged as an elasticsearch question. I doubt I am the only person who runs logstash without ever using elasticsearch :) – Badger Aug 13 '22 at 02:55