
I am using the jdbc input plugin to fetch data from a PostgreSQL database. A full export works fine and I am able to pull the data, but the plugin is not honouring the saved state: on every run the entire table is queried again and I end up with a lot of duplicates.

I checked .logstash_jdbc_last_run and the metadata state is being updated as expected, yet the plugin still imports the whole table on every run. Is anything wrong in the config?
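For reference, the last-run file just holds the last tracked value as YAML; mine looks roughly like this (the number is illustrative, not my real id):

cat ~/.logstash_jdbc_last_run
--- 12345

That value does change between runs, but the full table is still pulled each time.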

input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://x.x.x.x:5432/dodb"
    jdbc_user => "myuser"
    jdbc_password => "passsword"
    jdbc_validate_connection => true
    jdbc_driver_library => "/opt/postgresql-9.4.1207.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "select id,timestamp,distributed_query_id,distributed_query_task_id, "columns"->>'uid' as uid, "columns"->>'name' as name from distributed_query_result;"
    schedule => "* * * * *"
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    clean_run => true
  }
}
output {
  kafka {
    topic_id => "psql-logs"
    bootstrap_servers => "x.x.x.x:9092"
    codec => "json"
  }
}
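For comparison, the doc I linked below references the tracked value inside the query through the :sql_last_value parameter. A minimal sketch of that pattern as I understand it (the WHERE clause is only my guess at how it would look for my table, not something I have tried yet):

    statement => "select id, timestamp, distributed_query_id, distributed_query_task_id from distributed_query_result where id > :sql_last_value order by id asc"
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"

Is that what is missing, or is the plugin supposed to filter on the tracking column by itself?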

Any help is appreciated! Thanks in advance. I used the doc below for reference: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html

hsg09