
Logstash 5.2.1

The configuration below is OK; the partial updates are working. I just misunderstood the results and how the time zone is used by Logstash.

From the jdbc input plugin documentation:

jdbc_default_timezone — Timezone conversion. SQL does not allow for timezone data in timestamp fields. This plugin will automatically convert your SQL timestamp fields to Logstash timestamps, in relative UTC time in ISO8601 format. Using this setting will manually assign a specified timezone offset, instead of using the timezone setting of the local machine. You must use a canonical timezone, Europe/Rome, for example.
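In other words, naive SQL timestamps are reinterpreted in the configured zone and stored on the event in UTC. A minimal sketch of the conversion, assuming a timestamp column without zone information and CET (UTC+1, since early March is before the DST switch):

SQL column value (naive local time):   2017-03-08 10:29:00
jdbc_default_timezone:                 Europe/Rome
Resulting Logstash timestamp (UTC):    2017-03-08T09:29:00.000Z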


I want to index some data from PostgreSQL to Elasticsearch with the help of Logstash. Partial updates should work.

But in my case, Logstash puts the wrong time zone in ~/.logstash_jdbc_last_run.

$cat ~/.logstash_jdbc_last_run 
--- 2017-03-08 09:29:00.259000000 Z

My PC/Server time:

$date
mer  8 mar 2017, 10.29.31, CET
$cat /etc/timezone 
Europe/Rome

My Logstash configuration:

input {
  jdbc {
    # Postgres jdbc connection string to our database, postgres
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/postgres"
    # The user we wish to execute our statement as
    jdbc_user => "logstash"
    jdbc_password => "logstashpass"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "/home/trex/Development/ship_to_elasticsearch/software/postgresql-42.0.0.jar"
    # The name of the driver class for Postgresql
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_default_timezone => "Europe/Rome"
    # our query; :sql_last_value is substituted as sketched after this config
    statement => "SELECT * FROM contacts WHERE timestamp > :sql_last_value"
    # every 1 min
    schedule => "*/1 * * * *"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "database.%{+yyyy.MM.dd.HH}"
  }
}
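On each scheduled run, :sql_last_value in the statement is replaced with the value persisted in ~/.logstash_jdbc_last_run, so the effective query is roughly the sketch below (the exact literal formatting and any timezone adjustment depend on the plugin version and are an assumption here):

SELECT * FROM contacts WHERE timestamp > '2017-03-08 09:29:00.259'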

Without jdbc_default_timezone the time zone is wrong too.

My PostgreSQL data:

postgres=# select * from "contacts";
 uid |         timestamp          |          email          | first_name | last_name
-----+----------------------------+-------------------------+------------+------------
   1 | 2017-03-07 18:09:25.358684 | jim@example.com         | Jim        | Smith
   2 | 2017-03-07 18:09:25.3756   |                         | John       | Smith
   3 | 2017-03-07 18:09:25.384053 | carol@example.com       | Carol      | Smith
   4 | 2017-03-07 18:09:25.869833 | sam@example.com         | Sam        |
   5 | 2017-03-08 10:04:26.39423  | trex@example.com        | T          | Rex

The DB data is imported like this:

INSERT INTO contacts(timestamp, email, first_name, last_name) VALUES(current_timestamp, 'sam@example.com', 'Sam', null);
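A hedged alternative, not part of the original setup: storing UTC explicitly keeps the naive timestamp column directly comparable to the UTC value that Logstash persists in ~/.logstash_jdbc_last_run.

-- current_timestamp is a timestamptz; AT TIME ZONE 'UTC' converts it to a
-- naive timestamp expressed in UTC before it is stored in the column
INSERT INTO contacts(timestamp, email, first_name, last_name)
VALUES(current_timestamp AT TIME ZONE 'UTC', 'sam@example.com', 'Sam', null);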

Why does Logstash put the wrong time zone in ~/.logstash_jdbc_last_run? And how to fix it?

srgbnd

2 Answers


2017-03-08 09:29:00.259000000 Z means the UTC timezone; it's correct.
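The value only looks shifted because it is printed in UTC; converting it back to the machine's zone gives the expected local time (CET is UTC+1 in early March, before the DST switch):

2017-03-08 09:29:00.259 Z        (value in ~/.logstash_jdbc_last_run, UTC)
2017-03-08 10:29:00.259 +01:00   (the same instant in Europe/Rome, CET)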

Đào Minh Hạt
  • OK, then why is the 5th DB record imported continuously? See the updated question. – srgbnd Mar 08 '17 at 09:40
  • Sorry, my previous comment is misleading. I have `jdbc_default_timezone => "Europe/Rome"` configured in Logstash. So I should get the Rome timezone, shouldn't I? – srgbnd Mar 08 '17 at 09:42
  • `Timezone conversion. SQL does not allow for timezone data in timestamp fields. This plugin will automatically convert your SQL timestamp fields to Logstash timestamps, in relative UTC time in ISO8601 format.` So the timezone after import is always UTC; this setting is only for the conversion. – Đào Minh Hạt Mar 08 '17 at 09:46
  • Ok, then what do I need to configure to get the correct partial updates? – srgbnd Mar 08 '17 at 09:53

It defaults to UTC time. If you would like to store the timestamp in a different timezone, you can convert it by adding a filter like so:

filter {
  mutate {
    add_field => {
      # Create a new field with the string value of the UTC event date
      "timestamp_extract" => "%{@timestamp}"
    }
  }

  date {
    # @timestamp renders in ISO8601 (e.g. 2017-03-08T09:29:00.259Z),
    # so parse it as ISO8601 and convert it into a new field
    match => [ "timestamp_extract", "ISO8601" ]
    timezone => "Europe/Rome"
    locale => "en"
    remove_field => [ "timestamp_extract" ]
    target => "timestamp_europe"
  }
}

This converts the timestamp by first extracting it into a timestamp_extract field and then parsing it with the Europe/Rome timezone; the converted timestamp is put in the timestamp_europe field.

Hope it's clearer now.

Adnan H