
I would like to automate two tasks with a Jenkins job: updating Elasticsearch with the latest data on demand, and recreating the index and re-feeding the data.

I am using the jdbc input plugin to fetch data from two different databases (PostgreSQL and Microsoft SQL Server). When the Jenkins job is triggered on demand, Logstash should run the config file and perform the tasks described above. However, we also have a cron job running on the same AWS server where the on-demand Logstash job would run. The issue is that the job triggered via Jenkins starts another Logstash process alongside the Logstash instance already running under cron. This ends up spawning multiple Logstash processes that are never terminated once the on-demand work is done.

Is there a way to achieve this scenario? Can the Logstash instance started by the Jenkins job be terminated automatically, or is there some sort of queue that would let us handle on-demand indexing requests?
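One possible approach (a sketch, not a definitive answer): if the jdbc inputs in the on-demand config have no `schedule` option, Logstash runs each query once and exits when the pipeline drains, so no process lingers. Duplicate starts from repeated Jenkins triggers can then be prevented with a lock. The paths below (config file, Logstash install location, lock file) are assumptions for illustration:

```shell
#!/bin/sh
# Hypothetical Jenkins build step: run a one-shot Logstash pipeline,
# guarded so only one on-demand instance can run at a time.

LOCK=/tmp/logstash-ondemand.lock                 # assumed lock file path
CONF=/etc/logstash/ondemand.conf                 # assumed config; jdbc inputs have NO `schedule`

# flock -n exits non-zero immediately if another run holds the lock,
# so a second Jenkins trigger cannot start a duplicate Logstash process.
# --path.data must differ from the cron instance's data directory,
# because two concurrent Logstash processes cannot share one.
flock -n "$LOCK" /usr/share/logstash/bin/logstash \
  -f "$CONF" --path.data /tmp/logstash-ondemand
```

With no `schedule` in the jdbc input, Logstash terminates on its own after the queries finish, which avoids having to kill it from Jenkins.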

PS: I am new to the ELK stack.

user977815
  • I don't think starting and stopping logstash is the right way to do it. You could leave logstash running and push your logs to it with Jenkins (with curl on an http input/writing to a file with a file input...) – baudsp Nov 22 '17 at 10:47
  • @baudsp I am using jdbc plugin to get data from db and not logs. I am not clear how I would be able to use the approach you mentioned. – user977815 Nov 22 '17 at 18:59
  • Well, if the data is pushed into a db, the situation has not changed. Just leave logstash running with the jdbc inputs. – baudsp Nov 23 '17 at 08:58
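The alternative suggested in the comments above (one long-running Logstash that Jenkins pushes to) could look roughly like this sketch, where the port is an assumption and the jdbc inputs stay as they are:

```
# Hypothetical addition to the existing pipeline config:
# an http input lets Jenkins trigger indexing without
# starting a second Logstash process.
input {
  http {
    port => 8080   # assumed port
  }
}
```

Jenkins would then POST to that endpoint (e.g. with `curl -X POST http://localhost:8080/`) instead of launching its own Logstash instance.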

0 Answers