Questions tagged [logstash-jdbc]

187 questions
2
votes
1 answer

Logstash Configuration Error - jdbc_driver_library is not set

I am using Logstash to move data from my Microsoft SQL Server database to ElasticSearch. I receive the following error in the log files when I try to run logstash. I run: sudo -Hu logstash /usr/share/logstash/bin/logstash…
PanczerTank
  • 1,070
  • 4
  • 14
  • 31
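
This error usually means the jdbc_driver_library / jdbc_driver_class options are missing or point to a JAR the logstash user cannot read. A minimal sketch of a SQL Server input, assuming the Microsoft JDBC driver JAR has been placed at /usr/share/logstash/mssql-jdbc.jar (path, host, credentials and table are placeholders, not taken from the question):

input {
  jdbc {
    # Path to the Microsoft JDBC driver JAR; must be readable by the logstash user
    jdbc_driver_library => "/usr/share/logstash/mssql-jdbc.jar"
    jdbc_driver_class   => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
    jdbc_user     => "logstash"
    jdbc_password => "secret"
    statement     => "SELECT * FROM my_table"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my_index"
  }
}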
2
votes
1 answer

TypeError: no implicit conversion of Integer into String

In my pipeline this problem occurs: TypeError: no implicit conversion of Integer into String. I am using CentOS 7 and installed it via yum. # Sample Logstash configuration for creating a simple input { jdbc { jdbc_connection_string =>…
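
One common cause of this TypeError is a setting the jdbc plugin expects as a string being given as a bare number (for example a numeric password). A hedged sketch with every value quoted; driver path, connection string and credentials are placeholders:

input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/mysql-connector-java.jar"
    jdbc_driver_class   => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user     => "root"
    # Quote numeric credentials; an unquoted number is parsed as an Integer
    jdbc_password => "123456"
    statement     => "SELECT * FROM my_table"
  }
}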
2
votes
0 answers

Avoid joins by multiple select statements - Logstash

I'm using Logstash to migrate data from MySQL to Elasticsearch. My MySQL database has a primary table called product that has many relations; the query to select it contains around 46 left outer joins, and the result returned is very large (50k) rows…
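
One possible pattern, not taken from the question itself, is to split the giant join into several jdbc inputs and let Elasticsearch merge the partial rows into one document per product via update/upsert. A rough sketch; the statements, index name and product_id field are assumptions:

input {
  jdbc {
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class   => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/shop"
    jdbc_user => "user"
    jdbc_password => "pass"
    statement => "SELECT id AS product_id, name, price FROM product"
  }
  jdbc {
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class   => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/shop"
    jdbc_user => "user"
    jdbc_password => "pass"
    statement => "SELECT product_id, category_name FROM product_category"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "products"
    # Merge partial rows into one document per product instead of joining in SQL
    document_id   => "%{product_id}"
    action        => "update"
    doc_as_upsert => true
  }
}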
2
votes
1 answer

Logstash Oracle JDBC (Failed to load ojdbc8.jar)

I configured the Logstash connection for Oracle but it is not working. Error: failed to load c:\ojdbc8.jar. Do you have any solution for this situation? input { jdbc { jdbc_connection_string => "jdbc:oracle:thin:@192.168.10.10:1521/TESTDB" jdbc_user…
Emre Sts
  • 95
  • 1
  • 6
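
On Windows a backslash path such as c:\ojdbc8.jar is often the culprit, since the backslash acts as an escape character; forward slashes usually avoid the problem. A sketch under that assumption (user, password and statement are placeholders):

input {
  jdbc {
    # Use forward slashes (or escaped backslashes) in Windows paths
    jdbc_driver_library => "C:/ojdbc8.jar"
    jdbc_driver_class   => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@192.168.10.10:1521/TESTDB"
    jdbc_user     => "testuser"
    jdbc_password => "testpass"
    statement     => "SELECT * FROM test_table"
  }
}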
2
votes
1 answer

Delete old documents from Elastic Search using logstash

I am using Logstash to index data from Postgres (JDBC input plugin) into Elasticsearch. I don't have any time-based information in the database. The Postgres table users to import has 2 columns - userid (unique), uname. Elasticsearch export - _id =…
jsanjayce
  • 272
  • 5
  • 15
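
A common workaround when the source has no time column is to give each document a deterministic _id, so repeated runs overwrite existing users; rows deleted in Postgres still have to be cleaned up separately (Logstash does not delete documents for you). A rough sketch, with driver path, connection details and schedule as placeholders:

input {
  jdbc {
    jdbc_driver_library => "/path/to/postgresql.jar"
    jdbc_driver_class   => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "user"
    jdbc_password => "pass"
    statement => "SELECT userid, uname FROM users"
    schedule  => "*/5 * * * *"   # re-import periodically
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "users"
    # Reuse the unique userid as _id so re-imports update instead of duplicating
    document_id => "%{userid}"
  }
}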
2
votes
1 answer

Logstash JDBC tracking column value not latest timestamp

Database Given the following PostgreSQL table test (some columns omitted, e.g. data which is used in the pipeline): id (uuid) | updated_at (timestamp with time zone) 652d88d3-e978-48b1-bd0f-b8188054a920 | 2018-08-08…
Markus Ratzer
  • 1,292
  • 3
  • 19
  • 29
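
For reference, the usual way to make the jdbc input track the updated_at column rather than the time the query ran is use_column_value together with tracking_column. A hedged sketch; driver path, connection details and column list are placeholders:

input {
  jdbc {
    jdbc_driver_library => "/path/to/postgresql.jar"
    jdbc_driver_class   => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "user"
    jdbc_password => "pass"
    # Persist the highest updated_at seen instead of the run timestamp
    use_column_value     => true
    tracking_column      => "updated_at"
    tracking_column_type => "timestamp"
    statement => "SELECT id, updated_at, data FROM test WHERE updated_at > :sql_last_value ORDER BY updated_at ASC"
    schedule  => "* * * * *"
  }
}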
2
votes
1 answer

logstash error : Error registering plugin, Pipeline aborted due to error ()

I'm a beginner with ELK and am trying to load data from MySQL to Elasticsearch (as a next step I want to query it via the Java REST client), so I used logstash-6.2.4 and elasticsearch-6.2.4 and followed an example here. When I run: bin/logstash -f…
Chu
  • 163
  • 1
  • 4
  • 9
2
votes
2 answers

Replacing @timestamp using datetime from JDBC input

How does someone replace @timestamp field in a Logstash pipeline without converting DateTime to a string and then doing a date filter on that column? mutate { convert => ["datetime", "string"] } date { match => ["datetime", "ISO8601"] }
Evaldas Buinauskas
  • 13,739
  • 11
  • 55
  • 107
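
Since the jdbc input already hands a datetime column over as a Logstash timestamp object, one option, sketched below under that assumption (the field name datetime comes from the question), is to assign it directly in a ruby filter instead of round-tripping through a string and a date filter:

filter {
  ruby {
    # Copy the JDBC datetime column straight into @timestamp
    code => "event.set('@timestamp', event.get('datetime'))"
  }
  mutate {
    remove_field => ["datetime"]   # optional: drop the now-duplicate field
  }
}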
2
votes
2 answers

Import data from mysql to elasticsearch using logstash

I would like to import data from my MySQL database to elasticsearch with logstash. I am already able to import a custom query result, but I am missing the point where I can define the mapping/settings of the index being created by logstash. Also, I…
RoyRobsen
  • 457
  • 1
  • 9
  • 20
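
Logstash itself does not derive mappings from SQL types; the usual route is to create an index template in Elasticsearch before the first import, or point the elasticsearch output at one. A sketch of the output side, assuming a template file at /etc/logstash/templates/my_template.json (a hypothetical path):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my_index"
    # Load mappings/settings from a template instead of relying on dynamic mapping
    template           => "/etc/logstash/templates/my_template.json"
    template_name      => "my_template"
    template_overwrite => true
  }
}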
2
votes
0 answers

How to define mapping in logstash for SQL attributes

I use logstash to index data from a database (in this case Postgres) and put it in an Elasticsearch index. This is my config: input { jdbc { jdbc_driver_library => "/path/to/driver" jdbc_driver_class => "org.postgresql.Driver" …
2
votes
2 answers

Why does Logstash put the wrong time zone in ~/.logstash_jdbc_last_run?

Logstash 5.2.1. The configuration below is OK; the partial updates are working. I just misunderstood the results and how the time zone is used by Logstash. jdbc_default_timezone Timezone conversion. SQL does not allow for timezone data in timestamp…
srgbnd
  • 5,404
  • 9
  • 44
  • 80
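
As the excerpt hints, the value written to ~/.logstash_jdbc_last_run is kept in UTC; jdbc_default_timezone only tells the plugin how to interpret timestamps the database stores without zone information. A hedged illustration (zone, driver path and statement are placeholders):

input {
  jdbc {
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class   => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "pass"
    # Interpret zone-less SQL timestamps as this zone; :sql_last_value is still stored in UTC
    jdbc_default_timezone => "Europe/Vilnius"
    statement => "SELECT * FROM my_table WHERE updated_at > :sql_last_value"
    schedule  => "* * * * *"
  }
}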
2
votes
1 answer

Unbelievably slow indexing in ElasticSearch

We decided to include a search engine in our product and are comparing ElasticSearch and Solr. When we started working with Elastic 2.3.3, we faced the problem of slow indexing. We feed Elastic using Logstash, and indexing a table with 4000000…
inatoff
  • 151
  • 2
  • 12
2
votes
1 answer

Logstash JDBC Input plugin : Migrate data from mysql in batch count

I have a 20 GB table with 50 million rows that needs to be migrated to Elasticsearch using the Logstash JDBC input plugin. I have tried the basic implementation but need help migrating the data in batches, i.e. only 10,000 rows at a time. I am not sure how…
Chitra
  • 41
  • 1
  • 4
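
The jdbc input can page through the result set itself; a sketch using jdbc_paging_enabled with a page size of 10,000 (driver path and connection details are placeholders):

input {
  jdbc {
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class   => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "pass"
    # Wrap the statement in LIMIT/OFFSET pages of 10,000 rows instead of one huge result set
    jdbc_paging_enabled => true
    jdbc_page_size      => 10000
    jdbc_fetch_size     => 10000
    statement => "SELECT * FROM big_table"
  }
}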
2
votes
1 answer

How to print the logs of logstash execution

I have been trying to search this online, but couldn't get any lead. Is there a way we can print logstash execution output to a log file? For example, I am using a jdbc plugin to read data as per sql_last_start. I want to know at what time the query…
Crickcoder
  • 2,135
  • 4
  • 22
  • 36
1
vote
0 answers

How to get diff by comparing 2 json field in Logstash

I tried to get the difference between these two JSON objects coming from JDBC in Logstash. Example JDBC JSON: before = '{"heroes":[{"id":1,"name":"pudge"},{"id":2,"name":"slark"},{"id":3,"name":"techies"}]}' after =…
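
One way to compute the difference inside the pipeline, sketched here for the before/after string fields and the heroes array from the question (field names are assumptions), is to parse both JSON strings and subtract the arrays in a ruby filter:

filter {
  ruby {
    code => "
      require 'json'
      # Parse the raw JSON strings delivered by the JDBC input
      before = JSON.parse(event.get('before') || '{}')['heroes'] || []
      after  = JSON.parse(event.get('after')  || '{}')['heroes'] || []
      # Array difference keeps elements present in one list but not the other
      event.set('added_heroes',   after - before)
      event.set('removed_heroes', before - after)
    "
  }
}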