I'm trying to forward logs to Elasticsearch and got stuck setting the index dynamically, based on a field in the input data. My input is JSON and always has the key "es_idx". I want each record routed to the index named by that key, with a date suffix appended, so a record with "es_idx": "blabla" should end up in an index like blabla-2020.05.27. I use logstash_format true to get the date-stamped index names and logstash_prefix to override the default index prefix.
This is what my fluentd config looks like:
# fluentd/conf/fluent.conf
<source>
  @type stdin
  # Input pattern. It depends on the Parser plugin.
  format json
  # Optional. Default tag is stdin.events.
</source>
<match *.**>
  @type copy
  <store>
    @type stdout
  </store>
  <store>
    @type elasticsearch
    host <es-host>
    port <es-port>
    logstash_format true
    logstash_prefix ${$record["es_idx"]}
    type_name fluentd
    flush_interval 5s
  </store>
</match>
With the input {"tenant_id": "test", "es_idx": "blabla"}, I get the following error:
2020-05-27 10:38:06 +0300 [warn]: #0 dump an error event: error_class=Fluent::Plugin::ElasticsearchErrorHandler::ElasticsearchError error="400 - Rejected by Elasticsearch" location=nil tag="stdin.events" time=2020-05-27 10:37:59.498450000 +0300 record={"tenant_id"=>"test", "es_idx"=>"blabla"}
If I set logstash_prefix to a static string instead, e.g. logstash_prefix blabla, it works fine.
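For reference, this is the working variant of the <store> block, identical except for the prefix line (<es-host>/<es-port> are placeholders as above):

<store>
  @type elasticsearch
  host <es-host>
  port <es-port>
  logstash_format true
  # A static prefix works: documents land in indices named blabla-YYYY.MM.DD
  logstash_prefix blabla
  type_name fluentd
  flush_interval 5s
</store>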
Does anyone have a clue what the issue might be?