
I would like to import data from my MySQL database into Elasticsearch with Logstash. I am already able to import the result of a custom query, but I can't figure out where to define the mapping/settings of the index that Logstash creates. I also have no idea how to import data with one-to-many relations.

This is my logstash.conf so far:

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/shop"
    jdbc_user => "root"
    jdbc_password => ""
    jdbc_driver_library => "C:\Users\curUser\Desktop\mysql-connector-java-5.1.42\mysql-connector-java-5.1.42-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"

    statement => "SELECT * FROM variants var"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "search"
    document_type => "variants"
    document_id => "%{id}"
  }
}

Note: I want to use Logstash only to import the data into Elasticsearch. This is for an online shop, so I have to apply some analyzers to the fields.

RoyRobsen

2 Answers


I've looked through the documentation of the Elasticsearch output plugin for Logstash, and I'm not sure this is possible.

However, you can set the index mapping before running Logstash, for example by putting the index mapping with cURL (a rough sketch is shown below), or maybe there is a way to perform this HTTP request from Logstash itself.

See https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping.html and https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html
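For instance, a minimal sketch of creating the index with settings and a mapping up front. The analyzer name and the field names (such as name) are placeholders for your own schema, and this assumes an Elasticsearch version that still uses mapping types, to match the document_type => "variants" in your output:

    curl -XPUT "http://localhost:9200/search" -H 'Content-Type: application/json' -d'
    {
      "settings": {
        "analysis": {
          "analyzer": {
            "shop_analyzer": {
              "type": "custom",
              "tokenizer": "standard",
              "filter": ["lowercase", "asciifolding"]
            }
          }
        }
      },
      "mappings": {
        "variants": {
          "properties": {
            "id":   { "type": "integer" },
            "name": { "type": "text", "analyzer": "shop_analyzer" }
          }
        }
      }
    }'

Once the index exists with that mapping, Logstash will simply index documents into it and your analyzers stay in place.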

aclowkay
  • I have already read the documentation and came to the same conclusion. I know how to put the index mapping with curl, but what about the nested fields? Do you have any idea? A simple "inner join" does not work – RoyRobsen Aug 03 '17 at 12:08
  • Then you could use something like ` filter { mutate { rename => { "columnToNest" => "[level1][level2]" } } } ` – aclowkay Aug 03 '17 at 12:33
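Expanding on the comment above, a minimal sketch of that approach (the flat column names and the target object are made up): the mutate filter renames flat JDBC columns into a nested object before the event reaches Elasticsearch:

    filter {
      mutate {
        rename => {
          "product_name"  => "[product][name]"
          "product_price" => "[product][price]"
        }
      }
    }

Each row coming from the JDBC input then ends up with a nested product object instead of two flat columns.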
  1. For mapping/settings, you can use the Elasticsearch REST API.
  2. Once the mapping/settings are created, use the same index name in the logstash.conf file.
  3. If your table (SELECT * FROM variants var) has many columns and you don't want to import all of them, remove the unwanted columns with a mutate filter:

    filter { mutate { remove_field => ["@version", "@timestamp", "column"] } }
    
  4. If you want to rename a column, use:

    filter { mutate { rename => { "id" => "ID" } } }
    
  5. A join query can be added just like the select query, directly in the jdbc statement; see the sketch after this list.
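For example, a sketch of a flattened one-to-many join (the products table and its columns are hypothetical, adjust to your schema), which yields one Elasticsearch document per variant with its parent product's fields copied in:

    statement => "SELECT var.id, var.price, p.name AS product_name FROM variants var JOIN products p ON p.id = var.product_id"

Combined with the rename trick from the comments above, the joined columns can then be nested under a product object.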

marc_s
Biplab