
I've already created my index (`response_summary`) using Logstash, which loads data into the index from a MySQL database.

My concern is: how can I update the index manually whenever a new set of records is added to the database, without deleting and recreating the index all over again?

Or is there a way for this to happen automatically whenever a change is made to the database?

Any help would be appreciated.

Kulasangar
  • You need your own way of finding out when your SQL database changes (regular queries against it or similar), OR whenever you change something in the DB you do the same in ES. – Andrei Stefan Oct 25 '16 at 20:14
  • @AndreiStefan I could do it through Logstash by inserting a scheduled cron job, but I wanted to know if there's any other way to do it using ES. Thanks – Kulasangar Oct 26 '16 at 04:40
  • No way with ES. There used to be rivers in ES, but they were removed in ES 2.0. The alternative is the Logstash JDBC input plugin: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html – Andrei Stefan Oct 26 '16 at 04:46
  • For DB handling that's fine, but what if I'm uploading log files as input? Could I use the same `scheduler` within the input? – Kulasangar Oct 26 '16 at 04:55
  • No, you use the file input from Logstash, which picks up file changes automatically. – Andrei Stefan Oct 26 '16 at 04:58
  • Thanks @AndreiStefan. So even if the log file gets new records appended at the end, it would automatically tail and pick up the newly added data as well? I don't have to do a cron job as such. – Kulasangar Oct 26 '16 at 05:00
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/126688/discussion-between-kulasangar-and-andrei-stefan). – Kulasangar Oct 26 '16 at 05:04

1 Answer


There's no way to do this with ES alone. There used to be rivers in ES, but they were removed in ES 2.0. The alternative is the Logstash JDBC input plugin, which automatically picks up changes based on a defined schedule.
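As a minimal sketch (the connection details, table name, and the `id` tracking column are placeholders for your own schema), a scheduled pipeline could look like this; `:sql_last_value` is the plugin's built-in bookmark, so each run only fetches rows added since the previous one:

```
input {
  jdbc {
    # Placeholder connection details -- adjust for your environment
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    # Run the query every minute
    schedule => "* * * * *"
    # Only fetch rows added since the last run
    statement => "SELECT * FROM response_summary WHERE id > :sql_last_value"
    use_column_value => true
    tracking_column => "id"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "response_summary"
  }
}
```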

For doing the same with files, you have the LS file input plugin, which tails the files to pick up new changes and also keeps track of where it left off in case LS is restarted.
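Again as a sketch (the paths are placeholders), the file input needs no schedule at all; the sincedb file is what lets it resume where it stopped:

```
input {
  file {
    # Placeholder path -- point this at your actual log files
    path => "/var/log/myapp/*.log"
    # Read existing content on the first run, then tail for new lines
    start_position => "beginning"
    # sincedb records how far each file has been read, so a restarted
    # Logstash resumes instead of re-reading from scratch
    sincedb_path => "/var/lib/logstash/sincedb_myapp"
  }
}
```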

Andrei Stefan
  • Thanks, spot on :) – Kulasangar Oct 26 '16 at 05:27
  • I tried with the `jdbc` `schedule` in the Logstash conf, but when I checked it with Kibana it only picks up the last record which I added, not the whole set! I added something like this: `schedule => "* * * * *"` – Kulasangar Oct 26 '16 at 08:29
  • And the query used covers all the data? – Andrei Stefan Oct 26 '16 at 08:35
  • Sorry @Andrei, it was my mistake. The `document_id` was wrong (see the sketch after these comments). It works perfectly now! – Kulasangar Oct 26 '16 at 10:19
  • @AndreiStefan, I am facing the same issue, except that when I try to do an update, any new row gets indexed fine, but an update to an old row is also added as a new row. I want the old index row to get updated. I've also posted a detailed question [here](https://stackoverflow.com/questions/57625400/jdbc-update-does-not-update-existing-rows). Could you please help me? – AlisonGrey Sep 11 '19 at 10:37
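To expand on the `document_id` point from the comments above: as a sketch (assuming the table's primary key column is named `id`), reusing the DB primary key as the ES document id makes Logstash overwrite the existing document on re-index instead of creating a duplicate:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "response_summary"
    # Reuse the DB primary key as the ES document id, so an updated
    # row replaces its existing document instead of adding a new one
    document_id => "%{id}"
  }
}
```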