We created an index in Elasticsearch as follows. The index name is apachelog, dynamic mapping is set to "strict", and the httpresponse field is mapped as type integer:
curl -X PUT 'http://localhost:9200/apachelog' -d \
'{
  "log": {
    "dynamic": "strict",
    "properties": {
      "@fields": {
        "properties": {
          "agent": {"type": "string"},
          "city": {"type": "string"},
          "client_ip": {"type": "string"},
          "hitTime": {"type": "string"},
          "host": {"type": "string"},
          "httpresponse": {"type": "integer"}
        }
      },
      "@message": {"type": "string"},
      "@source_host": {"type": "string"},
      "@timestamp": {"type": "date", "format": "dateOptionalTime"}
    }
  }
}'
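For reference, the mapping of this index can be verified with the standard mapping API before Flume writes anything (an illustrative check, not part of our original setup; output omitted):

# Confirm that httpresponse is mapped as integer in the apachelog index
curl -X GET 'http://localhost:9200/apachelog/_mapping?pretty'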
Our Flume ElasticSearch sink is configured as below; notice that the index name is apachelog, the same as the index already created in ES:
# Write to ElasticSearch
collector.sinks.elasticsearch.type = org.apache.flume.sink.elasticsearch.ElasticSearchSink
collector.sinks.elasticsearch.channel = mc2
collector.sinks.elasticsearch.batchSize = 100
collector.sinks.elasticsearch.hostNames = localhost:9300
collector.sinks.elasticsearch.indexName = apachelog
collector.sinks.elasticsearch.clusterName = logsearch
collector.sinks.elasticsearch.serializer = org.apache.flume.sink.elasticsearch.ElasticSearchLogStashEventSerializer
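The agent is then started in the usual way (the conf directory and properties file paths below are placeholders; the agent name collector matches the property prefix above):

# Start the Flume agent containing the elasticsearch sink
# (paths are assumed values for illustration)
flume-ng agent \
  --conf /etc/flume/conf \
  --conf-file /etc/flume/conf/collector.properties \
  --name collector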
Now, when we start the Flume agent, we notice that a new index named apachelog-2015-09-09 is created in Elasticsearch, and in that index the httpresponse field is mapped as type string. Flume/ES adds documents to this newly created index, while the apachelog index we created explicitly remains unused.
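This is what we see when listing the indices and inspecting the mapping of the new index (illustrative commands; output omitted):

# List all indices; apachelog-2015-09-09 shows up alongside the empty apachelog index
curl -X GET 'http://localhost:9200/_cat/indices?v'

# Inspect the dynamically generated mapping of the date-suffixed index;
# httpresponse is reported as "string" here
curl -X GET 'http://localhost:9200/apachelog-2015-09-09/_mapping?pretty'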
Any idea why this is happening and how we can get Flume/ES to use our index as opposed to creating its own?