
I want to send output from Logstash to InfluxDB; is there any such plugin available?

The output is currently set to graphite. This is the InfluxDB config:

[input_plugins]

# Configure the graphite api
[input_plugins.graphite]
enabled = true
port = 2003
database = "AirAnalytics"  # store graphite data in this database
# udp_enabled = true  # enable udp interface on the same port as the tcp interface

This is the logstash config:

output {
    stdout {}
    graphite {
        host => "localhost"
        port => 2003
    }
}

I see the output in the console (stdout), but nothing else happens and nothing gets posted into InfluxDB. I checked the InfluxDB logs as well; nothing.

I tried posting the same message directly to InfluxDB via HTTP and it works, so there's no issue with the message or the InfluxDB install.
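
(For reference, a direct write against the InfluxDB 0.8 series API looks roughly like this; the series name and values below are placeholders.)

curl -X POST 'http://localhost:8086/db/AirAnalytics/series?u=<user name>&p=<pwd>' \
    -d '[{"name": "air_quality", "columns": ["time", "value"], "points": [[1411228800000, 42.5]]}]'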

Sumit Maingi

5 Answers


Solved it. I needed to pass the already-prepared InfluxDB-compatible string to InfluxDB via Logstash.

The following Logstash configuration snippet did the trick:

output {
    http {
        url => "http://localhost:8086/db/<influx db name>/series?u=<user name>&p=<pwd>"
        format => "message"
        content_type => "application/json"
        http_method => "post"
        message => "%{message}"
        verify_ssl => false
    }
    stdout {}
}

Note: If you use the format "json", Logstash wraps the body in a "message" field, which was causing the problem.
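
For context, the string in %{message} already has to be in the InfluxDB 0.8 series write format, i.e. something roughly like this (series name, columns and values are just placeholders):

[
    {
        "name": "air_quality",
        "columns": ["time", "sensor", "value"],
        "points": [
            [1411228800000, "sensor-1", 42.5],
            [1411228801000, "sensor-2", 40.1]
        ]
    }
]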

Sumit Maingi
  • this is inefficient; it does a post to influxdb PER message, instead of batching them where possible. – Dexter Legaspi Apr 26 '15 at 12:07
  • We had a ready string, as our application produces GBs of such data. The app basically collects data for 5 seconds, then combines rows of the same table, which optimizes the packet size as well as increasing InfluxDB's write throughput. The resultant string is also compressed (gzip, fast) and decompressed in Logstash by a custom filter that we have. – Sumit Maingi Apr 27 '15 at 08:52

It's available via logstash-contrib as an output: https://github.com/elasticsearch/logstash-contrib/blob/master/lib/logstash/outputs/influxdb.rb

Paul Dix

There is an influxdb output in logstash-contrib; however, it was added after 1.4.2 was released.

With logstash 1.5, there is a new plugin management system. If you're using 1.5, you can install the influxdb output with:

# assuming you're in the logstash directory
$ ./bin/plugin install logstash-output-influxdb
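
Once installed, it is configured like any other output. A minimal sketch, assuming the 0.8-era plugin options (the db, series and data_points values below are placeholders; check the plugin docs for the exact option names in your version):

output {
    influxdb {
        host        => "localhost"
        port        => 8086
        db          => "AirAnalytics"
        user        => "root"
        password    => "root"
        series      => "logstash_events"
        data_points => { "value" => "%{value}" }
    }
}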
Wilfred Hughes

Maybe this helps:

http://influxdb.com/docs/v0.8/api/reading_and_writing_data.html

Look at the section "Writing data through Graphite Protocol"; maybe you can use the graphite output of Logstash.

I think I am going to try that this weekend.
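
With that setup InfluxDB listens on port 2003 for Graphite's plaintext protocol (metric value timestamp, one per line), so you can sanity-check the listener before wiring up Logstash; the metric name below is just an example:

echo "air.sensor1.value 42.5 $(date +%s)" | nc localhost 2003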

  • Sounds promising, good idea, I'll give it a shot and will post my findings – Sumit Maingi Sep 20 '14 at 17:45
  • I have started to build a simple logstash output for InfluxDB. It is almost finished. I want to perform some more tests before publishing it on Github. This week it will be finished and published. – Peter Paul Holzhauer Sep 21 '14 at 17:25
  • Take a look at the code at GitHub. [PeterPaulH/logstash-influxdb](https://github.com/PeterPaulH/logstash-influxdb) – Peter Paul Holzhauer Sep 21 '14 at 18:41
  • I saw the plugin, but I already have the entire InfluxDB string properly formed; all I want to do is push it to InfluxDB via Logstash. I am trying out the graphite way right now, but I'm getting a weird error saying "cannot convert string into integer" from the Logstash graphite output plugin. Still looking into it; I will update the answer once I'm done. – Sumit Maingi Sep 25 '14 at 06:28

The accepted answer, while it works, is not very flexible because:

  • It requires the actual JSON payload to be in %{message} or whatever logstash variable you end up using
  • It doesn't submit the data points in batches where possible (unless, of course, you batch them in the JSON payload yourself...in which case, why are you even using Logstash in the first place?)

As noted by Paul and Wilfred, there is an influxdb output written by Jordan Sissel himself, but it was released after 1.4.2. The good thing is that it works with 1.4.2 (I've tried it myself): all you need to do is copy the influxdb.rb file into /lib/logstash/outputs and configure Logstash accordingly. As for the documentation, you can find it here; it took me a bit more effort to find because googling "influxdb logstash" doesn't put that link on the first page of results.
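
A rough sketch of that copy step, assuming Logstash 1.4.2 is unpacked under /opt/logstash and pulling the file straight from the logstash-contrib repository linked above (paths will differ on your install):

cd /opt/logstash
curl -o lib/logstash/outputs/influxdb.rb \
    https://raw.githubusercontent.com/elasticsearch/logstash-contrib/master/lib/logstash/outputs/influxdb.rb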

Dexter Legaspi
  • You are right, it's not flexible; Logstash is used for other reasons beyond extracting data from raw format. We had a ready string, as our application produces GBs of such data. The app basically collects data for 5 seconds, then combines rows of the same table, which optimizes the packet size as well as increasing InfluxDB's write throughput. The resultant string is also compressed (gzip, fast) and decompressed in Logstash by a custom filter that we have. – Sumit Maingi Apr 27 '15 at 08:48