
I have set up logging as described in https://quarkus.io/guides/centralized-log-management, with an ELK stack on version 7.7.

My Logstash pipeline looks like the proposed example:

input {
    gelf {
        port => 12201
    }
}
output {
    stdout {}
    elasticsearch {
        hosts => ["http://elasticsearch:9200"]
    }
}

Most messages show up in Kibana using logstash.* as the index pattern, but some messages are dropped.

An example of a dropped message:

2020-05-28 15:30:36,565 INFO [io.quarkus] (Quarkus Main Thread) Quarkus 1.4.2.Final started in 38.335s. Listening on: http://0.0.0.0:8085

The problem seems to be that the fields MessageParam0, MessageParam1, MessageParam2, etc. are mapped to the type that first appeared in the logs but actually contain multiple data types. The Elasticsearch log shows errors like:

org.elasticsearch.index.mapper.MapperParsingException: failed to parse field [MessageParam1]
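For illustration, the conflict could hypothetically look like this (the field values below are made up, not taken from my actual logs). The first indexed document carries a number in MessageParam1, so the mapping is created as numeric; a later document carries a string in the same field and is rejected:

{ "short_message": "Quarkus 1.4.2.Final started in 38.335s.", "MessageParam1": 38.335 }
{ "short_message": "Listening on: http://0.0.0.0:8085", "MessageParam1": "http://0.0.0.0:8085" }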

Is there any way in the Quarkus logging-gelf extension to correctly map the values?

Jonas
  • Do you mean `multiple data types` in the Elasticsearch index mapping, or that the input coming from `gelf` has different data types for the same fields across records? – JBone May 28 '20 at 14:17

1 Answer


Elasticsearch can auto-create your index mapping by looking at the first indexed document. This is a very convenient feature, but it comes with some drawbacks.

For example, if you have a field that can contain numbers or strings, and the first document contains a number for this field, the mapping will be created with a numeric type, so you will not be able to index a document containing a string in this field.
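As a minimal sketch (the index name here is assumed from the default daily Logstash naming), the auto-created mapping for such a field might look like this after the first document with a numeric value is indexed:

GET logstash-2020.05.28/_mapping

{
  "logstash-2020.05.28": {
    "mappings": {
      "properties": {
        "MessageParam1": { "type": "float" }
      }
    }
  }
}

Any later document with a string in MessageParam1 will then be rejected with the MapperParsingException from the question.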

The only workaround for this is to create the mapping upfront (you only need to define the fields that are causing the issue; the other fields will still be created automatically).
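A minimal sketch of such an upfront mapping, using a legacy index template (supported in Elasticsearch 7.7; the template name is made up, and keyword is just one possible choice for fields that mix numbers and strings):

PUT _template/quarkus-message-params
{
  "index_patterns": ["logstash-*"],
  "mappings": {
    "properties": {
      "MessageParam0": { "type": "keyword" },
      "MessageParam1": { "type": "keyword" },
      "MessageParam2": { "type": "keyword" }
    }
  }
}

With keyword, numeric values are simply indexed as strings, so documents carrying either type for these fields are accepted. Note that the template only applies to newly created indices; the mapping of an existing field cannot be changed.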

This is an ELK issue; there is nothing we can do on the Quarkus side.

loicmathieu